A SHORT HISTORY OF THE CDC
The United States needed a CDC from the beginning but didn’t know it. The country muddled through smallpox, cholera, malaria, and yellow fever without the national coordination that could have improved its response.
There are three essentials for good public health programs. The first is the conviction that the basis for public health is to achieve health equity; therefore, the bottom line is social justice in health. Second is the understanding that the science base is epidemiology. It is epidemiology that determines the gaps in social justice, identifies the groups with poor health outcomes, discovers the details of disease causation, and provides clues to how corrective action might improve health. The third essential is the need for good management for efficient implementation of corrective actions. I will assume that the first is a given. This chapter describes the development of epidemiology and management at the CDC.
Until 1950, not a single national surveillance program existed for any disease. While the principles of epidemiology were practiced wherever public health programs were implemented, no national leadership in epidemiology existed that states and counties could easily call on in times of need. That all changed in a relatively short period of time.
John Snow is often called the Father of Epidemiology. In an essay first published in 1849, he concluded that cholera was being spread by one of the providers of water in London, and in his 1854 investigation he pinpointed the Broad Street pump as a source of an outbreak. This was before the germ theory of disease was understood, so Snow’s conclusion rested on what we now regard as epidemiologic evidence. It must have seemed flimsy to many, but when the pump handle was removed at Snow’s urging, the outbreak abated.
It was an important moment in the history of epidemiology. However, even before this episode Ignaz Semmelweis, working at the Vienna General Hospital, was puzzled that the First Clinic, managed by doctors and his responsibility, had a higher rate of maternal mortality than the Second Clinic, managed by midwives. Puerperal fever following childbirth was usually the cause of death. Also known as “childbed fever,” this infection is usually caused by Staphylococcus or Streptococcus organisms entering the uterus and then becoming systemic. Assigned to the obstetrical unit in July 1846, Semmelweis began to ponder the different rates between the two clinics. He discovered that the differential risks were well known outside the hospital. The public understood that admissions alternated between the two units in twenty-four-hour blocks, so women would try to delay or hasten their arrival during labor in order to be admitted to the Second Clinic, with its lower risk of death (1).
The breakthrough in Semmelweis’s thinking came when a friend, Jakob Kolletschka, died after an accidental scalpel wound incurred while conducting an autopsy. His symptoms resembled those of puerperal fever, the illness affecting women after childbirth. Semmelweis concluded that doctors, who often conducted autopsies first thing in the morning, were carrying something from the autopsy room to the delivery room. Germs were not yet understood, but Semmelweis was correct, and he instituted a handwashing policy, using a solution containing calcium hypochlorite, between autopsy sessions and deliveries. Mortality in the First Clinic dropped from 18.3 percent in April 1847 to 2.2 percent that June. This phenomenal breakthrough in observation was followed by the determination and interpretation of rates, the introduction of an intervention, and the subsequent saving of lives. So perhaps it would be appropriate to designate Semmelweis as the Father of Epidemiology.
However, advances in epidemiology occurred even earlier than Semmelweis’s contributions. Oliver Wendell Holmes Sr., born in 1809, the same year as Abraham Lincoln, switched from law to medicine and spent formative years in Paris under the mentorship of Dr. Pierre Charles Alexandre Louis. Louis was an early clinical epidemiologist—meaning the discipline had practitioners and teachers well before Snow, despite the frequent assertion that Snow was the Father of Epidemiology—who demonstrated that bloodletting did not work. How many people were killed by draining blood is unknown, but Louis finally subjected the practice to study and found it wanting. He had a profound influence on Holmes, who in 1843 published a paper titled “The Contagiousness of Puerperal Fever” (2), postulating that physicians’ unclean hands spread puerperal fever to patients. Hence, we have a new contender for the title of Father of Epidemiology—Holmes.
A half-century before Holmes’s paper, Dr. Alexander Gordon published A Treatise on the Epidemic Puerperal Fever of Aberdeen (1795) (3), in which he warned that the condition was transmitted from one case to another by midwives and doctors. He wrote, “It is a disagreeable declaration for me to mention, that I myself was the means of carrying the infection to a great number of women.”
As Gordon was observing and writing his paper on puerperal fever, Edward Jenner was studying what happened to milkmaids during smallpox outbreaks. Jenner’s mentor, John Hunter, had suggested the topic, and a milkmaid, now lost to history, had said to Jenner that she was protected from smallpox because she had experienced cowpox on her hands from milking cows. Jenner studied the experience of milkmaids who had acquired cowpox lesions in this manner and documented their protection from smallpox during subsequent smallpox outbreaks. As mentioned earlier, in 1796, he demonstrated the ability to provide this protection intentionally (4) by transmitting material from the cowpox lesion of a milkmaid, Sarah Nelmes, to a boy, James Phipps. Phipps developed a sore at the site of injection, and weeks later he was protected from a deliberate attempt to infect him with the smallpox virus.
The lesson may be that there is no true Father or Mother of Epidemiology. Making observations to predict the future is as old as hunters, who improved their success by developing the equivalent of rates to compare the likelihood of success in different geographic areas and with various hunting methods. Likewise, gatherers remembered the chances of finding roots or berries based on terrain, weather, and time of year. Rates were the unrecognized bottom line for those observations.
The people mentioned were all important in the history of epidemiology. But many other events, some known, most unknown, have improved the field of epidemiology.
One event was the establishment of the first school of public health in the United States in 1916 at Johns Hopkins University. Funded by a grant from the Rockefeller Foundation, it included the secondment of Wade Hampton Frost by the US Public Health Service to start a Department of Epidemiology. His experience with yellow fever, influenza, and tuberculosis provided real-world experiences in applied epidemiology. Frost later became the dean of the school.
The legacy of applied epidemiology was passed on to Alexander Langmuir, who received his training at Johns Hopkins before moving to the CDC in 1949. At the CDC, he encountered a workforce shaped by the war effort, the need to advise on tropical diseases in returning troops, and a new fear, the threat of an enemy using microorganisms for biological warfare. This was to lead to his greatest contribution: the corps of medical detectives known as the Epidemic Intelligence Service (EIS).
Malaria
When Langmuir arrived in 1949, the CDC was only three years old, having evolved from the Malaria Control Program, which started in 1942. The United States, in 1942, was rapidly gearing up to confront the demands of two simultaneous wars, one in Europe to meet the aggression of Germany, the other in the Pacific to counter the Japanese attack on Pearl Harbor. Military training camps were found throughout the country, but especially in the South, where temperate weather permitted more days of outdoor training.
The price for better weather was malaria. Malaria reduced recruit training time. Trainees from other parts of the United States had never been exposed to the disease. The United States responded with a malaria control program, headquartered in Atlanta, Georgia. One clear objective was the establishment of a one-mile malaria-free zone around every military base. Thousands of workers drained swamps, used insecticides, and learned about tropical diseases. And time lost to illness among the recruits was reduced.
The Malaria Control Program became a brain trust for malaria research and other tropical diseases plaguing troops in the Pacific and after their return home. At the beginning of the war, Dr. Joseph Mountin pushed to protect the 600 or so military bases from malaria. Mountin became a legend in public health, working in the US Public Health Service during World War I and continuing until his early death at age 61 in 1952. He became a passionate promoter of everything from chronic disease control to infectious disease programs and even to polar health. At the end of World War II, Mountin foresaw the need for a strong national center for communicable diseases. He successfully kept this unusual group of scientists together with a mandate that expanded to include the study and control of communicable diseases in general. The program became the Communicable Disease Center (CDC) on July 1, 1946.
Mountin was a man of unusual vision. He not only saw the future of the CDC but also recognized the importance of epidemiology—hence, the arrival of Langmuir. Mountin also foresaw the role of chronic diseases in public health and initiated the Framingham Study in 1948. Much of what we know about heart disease and the role of diet, exercise, and medications comes from this continuing, long-term longitudinal study of a single community. Mountin may not have anticipated it, but the CDC eventually expanded its scope from infectious diseases to the broad spectrum of public health, including chronic diseases.
Langmuir’s arrival increased the emphasis on epidemiology, which he once described as simply acquiring a numerator and a denominator, developing a rate, and gathering enough information to understand that rate accurately. While he was a driving force in epidemiology in general, his lasting legacy was the program mentioned earlier, the EIS. An outbreak of hemorrhagic fever in troops during the Korean War underscored the immediate need for persons trained in detecting, understanding, and responding to epidemics. The United States was concerned that China had intentionally released Korean hemorrhagic fever; we later learned that China had expanded its own virology programs, fearing that the United States had released the virus.
In any case, a tradition was born. The dedication of the Alexander Langmuir Auditorium at the CDC in 2011 provided an opportunity both to honor Langmuir and to recall his approach. He was simultaneously respected, adored, and feared. Articles by EIS officers, intended for publication in the medical literature, were sent back time after time by Langmuir for revisions in order “to get it right.” The demands were balanced by his personal interest in the EIS officers. At the annual EIS conference, where current officers presented interesting outbreak investigations and were then subjected to a question-and-answer session, most officers shared the anxiety of having Langmuir stand to ask a question that went to the heart of the issue and had often been overlooked by the presenter.
The approach worked, and the eagerness to please Langmuir caused the EIS officers to become professional in a very short time. At the dedication of the auditorium, one of his granddaughters was a final speaker. She related that after his retirement, Langmuir would arrange a family dinner in Atlanta every year, at the time of the EIS conference. To her he was a grandfather, and she could not even fathom that grown people at the CDC would quake in his presence.
A strong EIS became part of the genetic code that formed the CDC of today. Because it was a way to fulfill a draft obligation for young men, many more applied than could be accepted, and Langmuir could be selective in his choices. Many of those selected found the work intriguing and made a vocation of public health even though they had planned to return after their two-year stint to other careers.
The assignment of EIS officers to states, cities, and universities provided the CDC with a direct connection to the field concerns of public health. Many of those assigned continued training in public health and worked with the CDC in state, county, and university positions. A trust developed between states and the CDC that may be unique in the annals of the federal government.
Langmuir’s vision of an EIS also had an effect on the CDC itself. To support this cadre of field epidemiologists, the CDC had to upgrade its abilities and have staff of the highest caliber. It needed to truly be the gold standard for public health practice and public health laboratory capabilities. It was forced into greatness.
Over time this truth was recognized, and epidemiology was used to define and measure health problems and to suggest responses. So epidemiology became important largely because of the driving force of Langmuir.
How did management become a priority? Largely by chance. Without great forethought or planning, the CDC was blessed by the unexpected development of a managerial cadre as the result of people who were concentrating on a specific cluster of diseases rather than the CDC in general.
Following World War II, Johannes Stuart, an economist working in the Venereal Disease Division of the Public Health Service, was charged with developing a program to treat and reduce the spread of sexually transmitted diseases, especially syphilis. He organized a program that employed carefully selected college graduates, who were taught to trace the contacts of people with sexually transmitted diseases and enroll them in treatment programs. A skeptical supervisor allowed him to institute a pilot program with six young college graduates in 1948. Great diplomatic and psychological skills were required to induce people to provide information on their sexual partners. And detective skills were required to find those sexual contacts, especially if they involved one-night stands. Unusual tenacity was also required. The program was successful, and its Washington, DC, headquarters was moved to the CDC after the CDC became recognized for its work in public health.
The most successful syphilis detectives were those with the best management and people skills, and they soon attracted the attention of those in charge of other public health programs at the CDC. Over time, venereal disease workers (officially known as public health advisors) were managing vaccine, tuberculosis, and smallpox programs. Indeed, it was the experience of surveillance and containment in the field of sexually transmitted diseases that proved so useful in smallpox eradication.
William “Bill” Watson, one of the first members of the group, eventually became the deputy director of the CDC. Bill had left for war as a young and capable product of rural South Carolina. His experiences included capture by the Germans and time in a prisoner of war camp outside of Dresden. He saw the bombing of that city and the destruction made famous by Kurt Vonnegut in his book Slaughterhouse-Five. Bill wrote his own book of his experiences, entitled First Class Privates. Bill Watson was revered as a leader and manager, and when the professional managers at the CDC (i.e., the public health advisors) organized to improve mutual support, they called themselves the Watsonian Society.
Location (Location, Location)
The importance of epidemiology and management is straightforward. But why is the CDC located next to Emory University?
There is a history to everything. The CEO of Coca-Cola, Robert Woodruff, had made a commitment to have Coca-Cola available wherever US troops were stationed during World War II. This stance later provided the company with overseas bottling plants that immensely benefited global expansion. It also resulted in a friendship between Woodruff and General Dwight D. Eisenhower. In the postwar years, Woodruff became a member of President Eisenhower’s golf clique, made up of some of the most powerful men in the nation. The press referred to this group as “Ike’s Millionaires.”
Woodruff also had close connections with Emory and served on the Emory board of trustees from 1935 to 1948. Starting in 1937, Woodruff began giving money to Emory for a series of buildings (seven by 2014) and programs, and he later provided Emory with a gift of $105 million to expand and deepen its capacity in higher education.
A plantation owned by Woodruff had a continuing problem with malaria, and Mountin soon became part of the Woodruff network, advising him on the latest knowledge about malaria control.
With the establishment of the CDC, Woodruff saw an opportunity to merge some of his interests. A fifteen-acre plot of land on Clifton Road, next to Emory, seemed to him the ideal place for the CDC. In 1947, Woodruff arranged to have it sold for $1 to the government and waited for his dream to be fulfilled. He was not acquainted with the long process of government planning, appropriations, and everything else involved, and he became impatient. Woodruff finally called President Eisenhower and got him on a golf course in Denver. (This, before any of us had cell phones.) He asked the president if he had paper and pencil and got an affirmative. He then said that he had given land to Emory to give to the government to build the headquarters for the CDC, but nothing was happening. Eisenhower told him he would be happy to check into it and asked for a paragraph on the details to be sent to him. Woodruff supposedly responded, “That is why I asked if you had paper and pencil.” Soon bulldozers were on the property and building began (5).
The CDC headquarters on Clifton Road became institutionalized as the only Public Health Service agency with headquarters outside of the Washington, DC, area. Its later prominence around the world as the standard for public health owes much to a succession of gifted scientists, but it was undoubtedly aided by Woodruff, who ensured that the CDC sat a safe distance from the fumbling hands of Washington, DC’s politicians.
A scientific revolution occurred in the twentieth century, mostly in our understanding of physics. There are still mysteries, to be sure, but, in general, a century after Albert Einstein’s papers on relativity, the God Particle had been demonstrated and the basic laws of physics had been established. There is no shortage of areas to explore in understanding the universe, parallel universes, and the like, but the foundations of physics on this planet have generally been set.
In biology, however, understanding lagged by fifty to one hundred years. Even so, the twenty-first century will be known as the century of biology. When Francis Crick died at age 88, I found myself contemplating the speed of biological progress in his lifetime. It was in April 1953 that he and James Watson reported their understanding of the double helix. I was just graduating from high school, where I had been taught that humans have twenty-four pairs of chromosomes rather than twenty-three. Crick and Watson now provided new insights into the genetic transmission of life. In the fifty years that followed, we went from not even knowing how many chromosomes humans have to a catalog of the billions of base pairs in the first complete human genome. We have not only learned about the four-letter alphabet that contains the secrets of every species in the history of the world, but we have also begun to write compositions using that alphabet. This opens the possibility of altering insect vectors so they cannot transmit human diseases, altering microorganisms in the development of vaccines, and providing rapid and accurate diagnostic techniques. It may even permit treatment of chronic conditions or cancers by attaching DNA segments to microorganisms that seek out the cancer or specific deficits, such as those found in muscular dystrophy. The possibilities are as challenging to understand as the universe itself.
I had the chance to meet and talk with Francis Crick at the Salk Institute in 2002, a year after I met James Watson. For a public health practitioner, these were significant events. Crick was in that elite group of geniuses who see through to solutions even outside their fields. He continued to contribute until his death. It is said that he was always gracious and humble; yet James Watson opens his book The Double Helix by saying, “I have never seen Francis Crick in a modest mood” (6). Whatever the truth, the CDC owes much to these two men.
Amid this developing scientific explosion, the world and the CDC stood to benefit from the agency’s close association with an academic community. However, Woodruff’s dream of a real connection between the CDC and Emory took decades to develop. While the relationship was cordial, it did not reach a lift-off stage. The development of an Emory program in community health, which matured into a school of public health, provided the vehicle. But critical mass was finally achieved when a dean of the school was recruited from the CDC. Dean Jim Curran had headed the HIV/AIDS program at the CDC. He provided an extraordinary calming effect on this volatile problem and strolled with confidence and deliberation through the minefields. Clinical medicine, public health, and medical research all saw themselves as the owners of the problem; Jim could converse with all of them. The gay and straight communities were often pitted against each other, but he was accepted by both. Friction between gay communities and communities of color posed some of the toughest problems of the late 1980s, but in his willingness to engage all, Dr. Curran was seen as a leader.
The search committee that recommended Dr. Curran for the position recognized his qualities, but not all were comfortable with a CDC person at Emory. A few resented the fact that the CDC would often get national or international publicity. But when the decision was made, no one ever looked back with regret. The Emory School of Public Health, at a very young age, became one of the leaders in public health education and then in global health education. During the Ebola outbreak in West Africa in 2014, the value of this relationship was highlighted. Twelve years earlier, the CDC had provided Emory with funds to develop a treatment center capable of handling diseases such as Ebola. When the time came to treat volunteers who had become infected in Africa, Emory was prepared to provide superb care. In addition, the Emory Eye Center deployed experts to West Africa to care for Ebola survivors suffering eye complications.
Woodruff would be very pleased.