CHAPTER TWO

Disposing of Degenerates

THE NOTION THAT INSANITY was somehow the product of an interaction between a vaguely defined inherited constitution and the environmental pressures of daily life had long been a cliché among those who sought to understand the origins of mental illness—and, indeed, of illness more generally. That acquired characteristics, even patterns of behavior, could be inherited—a view that came to be formally associated with the French biologist Jean-Baptiste Lamarck—was widely accepted, and these beliefs would continue to be held by the public at large into the early twentieth century. The biological constitution that parents handed on to their children was held to be the unique and peculiar combination of their own biological inheritances and the experiences, habits, and modes of life they had adopted over their lifetimes. Intemperance, slothfulness, immorality, poor diet, indulgence of the baser emotions, overstrain, and overwork—all these and more could have a deleterious effect on one’s offspring, giving rise to what was referred to as a “diathetic weakness” that produced lifelong troubles. Mothers in particular were thought to bear a heavy responsibility for the quality of the offspring, and they were expected to nurture their fetuses (and infants) both morally and physically—injunctions that provided a “scientific” warrant for the enforcement of Victorian limitations on women’s lives.

As the gloom attending the failures of asylum doctors spread, a new account of the roots of insanity emerged that simultaneously explained away their apparent therapeutic impotence and provided an alternative rationale for the isolation and containment of the insane in vast institutions. First articulated by the French alienist B. A. Morel in his Traité des dégénérescences in 1857, the new doctrine rapidly secured converts across Europe and then spread to America.

The first generation of asylum doctors had argued that insanity was a catastrophe that could potentially be visited on anyone. Indeed, in some respects, the prominent and successful members of society had particular reasons to fear this affliction, for they had the finest and most refined nervous systems, stretched tight as their ambitions placed greater strains on their brains. In the last third of the nineteenth century, psychiatrists increasingly adopted the directly contrary view, insisting that insanity was a symptom of degeneration and biological inferiority, a form of evolution run in reverse to be found for the most part among “the dregs of society.” Where madness had once been a disease of civilization, now it was held to mark its absence, the return to a lower state of being of an almost animalistic sort.1

In the hands of those developing the idea of degeneration, the influence of hereditary factors assumed a far darker and more deterministic form. Americans of earlier generations had bought William Buchan’s Domestic Medicine in vast quantities between 1775 and 1830, finding their beliefs about the importance of heredity repeatedly reinforced as they sought his advice on how to maintain their health. But Buchan’s message could be read optimistically, even as he warned of the consequences of dissipation. “Family constitutions,” he proclaimed, “are as capable of improvement as family estates”—provided one lived a life of prudence and discretion. To be sure, the contrary was equally true: “the libertine who impairs the one, does greater injury to his posterity, than the prodigal, who squanders away the other.” Progress ought to be the order of the day. Hence, odd as it might seem, for Americans of the Jacksonian Age, a belief in heredity and an optimistic attitude toward the possibilities of improving the human race went hand in hand.2

Not so in the closing decades of the nineteenth century, especially with respect to the burgeoning troublesome classes, among whom the insane and feeble-minded (a group that substantially overlapped) loomed large. Charles Darwin’s ideas about evolution, however controversial at first, had popularized the notion of the survival of the fittest as the indispensable engine of progress. But as Darwin’s cousin Francis Galton argued, in civilized societies the operation of this natural law was often suspended, prompting overbreeding by the less fit and underbreeding by the more thoughtful and conscientious, an observation that led him to embrace eugenics.3 It was a theme Darwin himself took up in The Descent of Man and Selection in Relation to Sex, published in 1871:

With savages, the weak in body or mind are soon eliminated; and those that survive commonly exhibit a vigorous state of health. We civilised men, on the other hand, do our utmost to check the process of elimination; we build asylums for the imbecile, the maimed, and the sick; we institute poor laws; and our medical men exert their utmost skill to save the life of everyone to the last moment. Thus the weak members of civilised societies propagate their kind. No one who has attended to the breeding of domestic animals will doubt that this must be highly injurious to the race of man.4

If one accepted this line of reasoning, its implications for those charged with running asylums and other institutions designed to cope with the dangerous and defective seemed obvious. Left to its own devices, a ruthless Nature might be expected to eliminate the diseased and defective. But since a misplaced charity had intervened to slow or even reverse the inexorable pathway to extinction, they threatened to return society’s kindness by reproducing their kind in ever-larger numbers.5 Quite what the hereditarian mechanism might be that produced such an outcome remained undefined, but that defects were passed from one generation to the next, and that this defective germ plasm was the primary explanation for feeble-mindedness, crime—and soon enough the whole panoply of social pathologies that afflicted late nineteenth-century societies—was accepted by experts and laymen alike. By the closing decades of the nineteenth century, what had once been the speculations of isolated asylum doctors acquired a broader cultural significance. Critics of those toiling in the asylums agreed with alienists on this point. As the New York neurologist Edward Spitzka put it, the “most important predisposing cause of insanity is undoubtedly hereditary transmission of structural and physiological defects of the central nervous apparatus.”6

Biological explanations of criminality, most famously advanced in the work of the Italian psychiatrist and criminal anthropologist Cesare Lombroso, were widespread. Lombroso argued that the propensity to violence and crime was visible in a carefully cataloged variety of malformations of the body: long arms, asymmetrical heads, sloping foreheads, projecting jaws. These physical stigmata and the impulsive behavior, vindictiveness, and absence of remorse and civilized morality attributed to the lower orders were thought to be two sides of the same coin.7 As the notion of deviants as evolutionary throwbacks gained broad acceptance, it became linked in the United States to rising nativist feelings.

Morel and his French colleagues were the first to proclaim the degenerate “a morbid deviation from the primitive human type.”8 Hysteria, epilepsy, and nervous disorders of all types, ranging all the way to insanity, were increasingly held to constitute a single class of afflictions. The epileptic and insane were thought to recapitulate in their bodies the pathologies of past generations, and to risk contaminating those around them. As Morel put it, as early as 1857, “The degenerate human being, if he is abandoned to himself, falls into a progressive degeneration. He becomes not only incapable of forming part of the chain of progress in human society, he is the greatest obstacle to this progress through his contact with the healthy part of the population.”9

Morel’s doctrines fell on receptive ears. There were soon mutterings about the need to penetrate “the mystery of the propagation of hereditary diatheses” so that medicine might move “towards purging the blood of races of living poisons.”10 In a France recently defeated by Prussia, suffering from a sharply declining birth rate and a pressing sense of national decline, the hereditary account of mental disorder had become entrenched. Defective biology was invoked to explain a whole host of putatively closely interrelated forms of pathology: crime, drunkenness, epilepsy, and hysteria, with madness and feeble-mindedness the final way station on the road to extinction.

In Germany, as early as the 1840s, Wilhelm Griesinger had introduced the notion that mental illness was brain disease. His successors were even more enamored of biological reductionism. German psychiatrists were wedded to laboratory research on the brains of deceased mental patients, ensconced in university clinics that kept them at a safe distance from the custodial realities of an asylum system that provided them with an endless supply of anatomical specimens. Almost to a man, they enthusiastically embraced degenerationist doctrines. Their convictions did not waver in the least when their minute histological studies failed to find the evidence of degeneration that their theories assured them must be there. In living patients they took scarcely any interest: convinced that insanity was an incurable disorder, they disdained those who suffered from it.


TO THE EXTENT THAT AMERICAN psychiatry drew on foreign sources in developing its own notions of the connections between mental illness and degeneration, it was to British counterparts that it turned—most notably to the work of the misanthropic but enormously influential Henry Maudsley, but also to Charles Mercier, Daniel Hack Tuke, and the Edinburgh alienist Thomas Clouston. Their textbooks were those that American asylum doctors consulted, if they troubled with book learning at all. In the absence of a substantial American text, publishers promptly made US editions of their books available.

For Maudsley, anatomy was destiny. Mental illness “has its foundation in a definite physical cause,” he wrote in The Physiology and Pathology of Mind, and the madman “is the necessary organic consequence of certain organic antecedents: and it is impossible that he should escape the tyranny of his organization.”11 His (or her) defects were written upon the body, in “malformations of the external ear; tics, grimaces, or other spasmodic movements of the face, eyelids or lips; stammering and defects of pronunciation.” The eyes were notable for “a vacantly-abstracted, or half-fearful, half-suspicious, and distrustful look. These marks are, I believe, the outward and visible signs of an inward and invisible peculiarity of cerebral organization.”12

Science, Maudsley claimed, had demonstrated that madness was the penalty to be paid for vice and immorality. “The so-called moral laws are laws of nature,” he wrote, which men “cannot break, any more than they can break physical laws, without avenging consequences. As surely as the raindrop is formed and falls in obedience to physical law, so surely do causality and law reign in the production and distribution of morality and immorality on earth.” Excess of any sort threatened the mental integrity of future generations, and “the wicked man” must be brought “to realize distinctly that his children and his children’s children will be the heirs of his iniquities.”13

Moral causes wrought physical changes on people’s bodies, and these degenerative modifications were successively “transmitted as evil heritages to future generations: the acquired ill of the parent becomes the inborn infirmity of the offspring. It is not that the child necessarily inherits the particular disease of the parent, but it does inherit a constitution in which there is a certain aptitude to some kind of morbid degeneration, an organic infirmity which shall be determined in its special morbid manifestations according to the external conditions of life.”14 Tuke, whose great-grandfather had founded the York Retreat in 1796, the institution that inspired the construction of reformed asylums in the English-speaking world, was blunter still: “Recklessness, drunkenness, poverty, misery characterise the class,” he insisted. “No wonder that from such a source spring the hopelessly incurable lunatics who crowd pauper asylums, to the horror of the [tax]payers.” They were most emphatically “an infirm type of humanity. ‘No good’ is plainly inscribed on their foreheads.”15

The implication was that it was positively misguided to attempt to cure the mad and restore them to society. Such apparently humane and well-intentioned efforts “prevent, so far as is possible, the operation of those laws which weed out and exterminate the diseased and otherwise unfit in every grade of natural life.” Irrationally, the insane “are not only permitted, but are aided by every device known to science to propagate their kind.” They are “turned loose to act as parents to the next generation; centres of infection deliberately laid down; and yet we marvel that nervous disease increases.”16

The embrace of a rigid biological determinism, while it appeared to leave little scope for positive intervention, had the compensating virtue of explaining away the profession’s dismal therapeutic performance. Far from lamenting their failure to cure the mad, the citizenry should see it as a blessing in disguise, “for no human power can eradicate from insanity its terrible hereditary nature, and every so-called ‘cure’ in one generation will be inclined to increase the tale of lunacy in the next. [I]t is evident that the higher the percentage of recoveries in the present, the greater will be the proportion of lunatics in the future.”17 Asylums thus had a vital role to play in warding off the evils of insanity, even if it was by providing custody rather than a cure.


WELL-TO-DO AMERICANS HAD NEVER regarded the asylum with much favor. Hospitals were for the poor and friendless, not those of means, an attitude that only began to change once the combination of anesthesia and the aseptic revolution in surgery brought a change in the hospital’s reputation, and a therapeutic advantage to being treated in an in-patient setting. Still, if general hospitals were becoming essential to the practice of advanced medicine, the same could not be said of the asylum. The stigma attached to certification as a lunatic made confinement of a near relation in such places a potent source of shame, and inflicted serious damage on one’s social standing.

Those who possessed the resources preferred to cope with mental infirmity in domestic surroundings, consulting neurologists or other nerve doctors. In the aftermath of the Civil War, practitioners of the newly emerging specialty of neurology had found that many of the patients crowding into their waiting rooms complained of a variety of nervous disorders. Some were veterans of combat who carried no signs of physical pathology. Others were society matrons (or their daughters) or captains of industry suffering from the ancient disease of hysteria or the newly fashionable disorder of neurasthenia. They turned to nerve doctors after general practitioners had failed them.

Victims of train accidents soon joined them, and “railway spine” became part of an expanding lexicon of functional disorders—complaints that manifested themselves via physical symptoms but seemed at odds with what anatomy and physiology taught about the makeup of the body. Nervous patients not disturbed enough to require seclusion in an asylum created a demand for a new kind of nerve doctor, and neurologists willing to accept the challenge (and the income) soon found their ranks augmented by refugees from the asylum sector. A heterogeneous and disorganized group, these practitioners existed alongside traditional alienists, though at a considerable intellectual remove from them.

To treat their patients’ nervous prostration, these doctors invented an enormous variety of nerve tonics whose ingredients remained trade secrets. Many contained dangerous substances, some of which proved addictive: strychnine, morphine, cocaine, and lithium salts, for example. Others proffered animal extracts designed to fortify the nerves.18 Hydrotherapy was another treatment they frequently employed, while in their consulting rooms patients connected to elaborate machines received jolts of electricity that provided painful stimuli to the body. Electrotherapy had been popular in some circles from the eighteenth century onward, but for many it had the odor of the charlatan and the quack. As neurologists began to demonstrate the role of electricity in transmitting nervous impulses, however, it was not difficult to persuade themselves and their clientele that electrical charges had healing powers.19 Manufacturers rushed to meet the demand.20

The shocks were painful, of course, and were particularly touted as remedies for one of the commonest symptoms of hysteria, a loss or impediment of speech known as hysterical aphonia, since the electrodes could be applied directly to the larynx. Few remained mute for long when subjected to these interventions.21 More generally, neurasthenics and hysterics were informed that overuse and overstrain had disturbed the equilibrium of the nervous system. A commonly employed metaphor was that overstimulation and overuse had run down one’s battery, an image that suggested why the application of electricity could have therapeutic effects. Within a generation, though, practitioners who had originally insisted that electrical treatment worked directly on the nervous system increasingly conceded that its efficacy, such as it was, rested on the power of suggestion.

The most famous therapy the nerve doctors employed was Silas Weir Mitchell’s “rest cure.” Like most of his fellow neurologists, Weir Mitchell attributed the debility of his nervous patients to the wear on their systems caused by the pace of modern life. Wear and Tear: Or, Hints for the Overworked was the title of one of his monographs intended for popular consumption. The title of a subsequent volume in 1877 hinted at the solution: Fat and Blood.22 The book’s (and the treatment’s) popularity is suggested by the appearance of an eighth edition as late as 1911. Those subjected to the rest cure included the writers Charlotte Perkins Gilman and Virginia Woolf, both of whom wrote devastating fictional accounts of the experience.23 The rest cure lasted six or eight weeks, sometimes longer, all designed to build up the “fat and blood” of the patient. Prescribed a high-calorie diet and subjected to forced bed rest, patients were isolated and denied access to any and all forms of physical and mental stimulation, save for the electrotherapy, massage, and hydrotherapy that were prescribed as part of the treatment. It was a treatment only available to the well-to-do, and its primary “beneficiaries” were overwhelmingly women. Presumably some pronounced themselves better, and certainly nervous invalids from near and far flocked to Philadelphia to be treated by Weir Mitchell. Gilman and Woolf, however, found that the experience practically drove them mad. As for men, Weir Mitchell prescribed virtually the opposite regimen. Dubbed the “West cure,” it sent neurasthenic men out West to rope cattle, hunt, and engage in a sturdy contest with Nature. Thomas Eakins, Theodore Roosevelt, and Walt Whitman were among those who tried it.

It is little wonder that modern feminists have denounced the rest cure as a prototype of Victorian patriarchal oppression. Certainly, these gender-based curative regimes uncannily echo the stereotypes of the broader culture of the time: women subjected to an exaggerated version of domesticity, and men sent forth to strengthen nervous systems that their sedentary lives had enfeebled.24 By the early twentieth century, these earlier interventions were gradually giving way to various forms of psychotherapeutics.


EVEN AMONG THE RICH, however, many found their mentally disturbed relations too disruptive, too wearying, or too dangerous to themselves or others to be allowed to remain at large. From the earliest years of the asylum era, a handful of institutions had emerged whose clientele came from more affluent social strata than those who crowded the halls of the state hospitals. To be rich and mad might require shelter not just from the gaze of the world, but also from any intimate association with what the Massachusetts authorities called the “odor of pauperism.”25

At the McLean Asylum in Boston, the Butler Hospital in Rhode Island, and the Hartford Retreat in Connecticut, as well as at the Bloomingdale in New York and the Pennsylvania Hospital for the Insane in Philadelphia, continuous efforts were under way to compete for a limited number of wealthy patients whose families could choose where their relations would be confined. Though secure confinement of the mad might be the key requirement for such an establishment, the visible trappings of security had the potential to alienate the asylum’s true clientele, the patients’ families, and so these features had to be disguised beneath a veneer of good taste and cheerfulness. Manicured grounds could play a vital role in massaging the impressions of families faced with the difficult task of confining their nearest and dearest.26

The Hartford Retreat, which opened its doors in 1824, initially admitted a mix of public and private patients, there being no other asylum in the state. But once Connecticut constructed its first state asylum in 1858, the Retreat rapidly upgraded its facilities to attract wealthy patients. In 1860, it hired Frederick Law Olmsted to landscape its thirty-five acres of grounds (Olmsted also submitted a design for the campus of its rival, the McLean—though it was never carried out—and sadly ended his days as a patient there). After the Civil War, the Hartford Retreat began a rapid program to redesign and upgrade its physical fabric, aided by the fact that the war, and the fortuitous presence of the Colt armaments factory, had turned Hartford into the richest city in the country.

If opulent surroundings did not suffice, families could be reminded that the asylum provided a range of other amenities—French lessons, drawing classes, singing classes, theater, and the like. Staffing levels were high—the McLean boasted that “its sane population is about half as numerous as the insane patients”—and patient numbers remained constrained. If all these amenities meant that these private establishments cost six or eight times as much as their public counterparts, their clientele—the patients’ families—were not disposed to object. In these social circles, the linkage of mental illness and degeneracy was a sensitive subject. Just how carefully these doctors felt they had to proceed is suggested by the intellectual evolution of George Alder Blumer, who had previously been one of the most vocal psychiatrists endorsing eugenic ideas.

As superintendent of the Utica State Asylum in Upstate New York, Blumer was intimately familiar with the dire conditions of his state’s hospitals. He grew increasingly frustrated at the attempted micromanagement of his institution by Carlos MacDonald, who had been appointed head of a reconstituted state commission on lunacy in 1889. MacDonald, who had previously served as superintendent of three state hospitals, immediately set about enforcing uniform state standards, and in 1893 he endorsed state legislation that sharply cut hospital budgets, from $208 to $184 per patient per year. Instructions to hire a female assistant physician and limitations placed on his ability to admit and charge higher fees for private patients exacerbated Blumer’s frustration. His initial instinct was to challenge MacDonald, whom other asylum superintendents also resented for his interference. In such circles, MacDonald was increasingly viewed as a “Prince of Darkness.”27 But it soon became apparent who had the governor’s ear, and when neither a change of administration nor MacDonald’s resignation brought any relief, Blumer’s disillusionment reached the breaking point. In 1899, he accepted a new position as head of the private Butler Hospital in Rhode Island.

At first, Blumer was as vocal as ever about the links between mental illness, vice, and the biological defects that ran in families. In 1903, four years into his tenure at Butler, he was elected president of the American Medico-Psychological Association. He used the occasion of his presidential address to warn once again of the links between inherited biological defect and mental defect, speaking darkly of the “infinite disaster” that awaited society if the mad were allowed to give free rein to their instincts. He added that the insane were “notoriously addicted to matrimony and by no means satisfied with one brood of defectives.”28

The relatives of his new clientele were not shy when it came to communicating their unhappiness with such notions. It was all very well to dismiss the pauperized masses who crowded the state hospitals as degenerate fiends; not so when the blame was attached to the captains of industry and their close relations. Very quickly, Blumer found himself forced to soften and then almost repudiate his earlier views. In an attempt to soothe the anger of patients’ families, he informed them that “in these New England communities of ours, where brains are more apt to be highly organized than in less favored parts of the country, it may even be a mark of distinction to possess a mind of sufficient delicacy to invite damage under the stresses of life.”29

When the nineteenth-century neurologist George Beard had popularized the term “neurasthenia,” he had employed similar arguments to suggest that this weakness of the nerves was a condition peculiarly likely to afflict the rich and successful, whose nervous systems were more refined and delicate than those of the poor, and stretched by the stresses of their ultracivilized lifestyles. It was a well-judged piece of flattery that had salutary effects in expanding the market for these nerve doctors’ wares, and Blumer was not slow to recognize that such notions could easily be adapted to placate rich relatives of asylum inmates. He returned to the theme on later occasions, reminding them that “there cannot be complexity of the nervous system without what the world calls nervousness.”30

In private, and in communications solely intended for his fellow psychiatrists, Blumer made it clear that in reality his views had not changed at all. “Insanity is to a large extent a degeneracy,” he wrote to one correspondent in 1916; four years later, he told another to pay no mind to the “comforting view that there is less in heredity than most of us believe.”31 As for those “refined” New England elites, “Those old families have been breeding and inter-breeding ever since [they arrived in colonial times], insomuch that there are few of the old families in Rhode Island today which do not reveal the unhappy consequence in neuroses of some sort, and the evil work is still going on.”32

Most of Blumer’s colleagues worked in the ever-expanding state asylum system, and had come under little or no pressure to disguise their view that their charges were a biologically defective lot. Paid large sums to minister to rich patients, those who ran asylums for the well-to-do were keener than their colleagues in the public sector to seize upon innovations that promised cures. Besides, they faced great pressure from a clientele accustomed to the power of the purse, who expected results.

The sorts of patients who formed the bulk of the population of asylums like the Hartford Retreat and the McLean were by no means the only members of the well-to-do classes who came to the attention of psychiatrists. Precisely because of the stigma that attached itself so firmly to the “lunatic” or “mentally ill,” and the social erasure that accompanied commitment to an asylum or mental hospital, whenever it proved possible—unless management difficulties or the threat of scandal grew too powerful—the wealthy sought alternatives.


PRIVATE ESTABLISHMENTS FOR THE NERVOUS proliferated in the late nineteenth century: spas for those suffering from the strains imposed by the pace of modern life, and sanatoriums for successful men of business (and their wives) to recharge their run-down nervous batteries. Drawing patients by means of discreet advertisements or word of mouth, they occupied a shadowy existence in attractive but secluded locations. The most successful of these began life as a small institution for members of one of those new varieties of Christianity that have been a recurrent feature of the American scene. The Western Health Reform Institute was set up in 1866 to serve members of the Seventh Day Adventist Church who were experiencing mental or spiritual crises. Within a few decades, it had become the largest and most famous sanitarium in the world.

The Adventists traced their origins back to the religious revival of the Second Great Awakening and to the expectations of the End Time and the Second Coming of Christ that gripped the followers of William Miller in the early 1840s. Members of the church made wholeness and health (and an emphasis on “pure” food) a central part of their daily lives. Ellen White, the Church’s spiritual adviser, had received a series of visions from angels, Christ, and God, enjoining certain dietary constraints to be adopted by the Adventists while awaiting the Second Coming. (As the historian Ronald Numbers has pointed out, she borrowed her prescriptions from other “health reformers” active at the time.)33 On Christmas Day in 1865, a new vision told her to found an institution for members of the Church, lest they fall victim to “the sophistry of the devil” in hydropathic institutions run by nonbelievers. Though the divine injunction was rapidly put into practice, the institution struggled throughout its first decade.

Two brothers who were members of the faith, John Harvey and Will Keith Kellogg, took over the floundering institution and transformed it, with John attending to the medical side, and Will to the business end. The 106 patients at what was in 1876 called the Western Health Reform Institute were swamped by the 7,006 who patronized its successor in 1906. Masters of marketing, the Kellogg brothers modified the existing word for a health resort, “sanatorium,” and renamed the establishment the Battle Creek Sanitarium, nicknamed “the San.” A disastrous fire in 1902 consumed most of the buildings. The catastrophe provided the means for the Kelloggs to separate their enterprise from the Church. They secured their own financing for its reconstruction, and expanded still further. Staff grew to 800, including thirty physicians, and Battle Creek’s treatment rooms could accommodate as many as a thousand patients at a time.

All sorts of affluent patients—“nervous” invalids, those plagued by insomnia and headaches or by a multitude of aches and pains—came to Battle Creek to reinvigorate their nervous systems and restore their health. John Harvey Kellogg introduced them to a regimen based on a cleansing vegetarian diet (meat was held to rot in the gut and poison the system), frequent enemas to cleanse the colon, abstinence from alcohol and sex, cold showers, electrotherapy and phototherapy, and plentiful open-air exercise. Whole grains, fiber, and nuts were the order of the day, often transformed into flaked cereals (the secret of which was appropriated by one of their patients, C. W. Post, much to Will’s dismay).34 Beyond this, peanut butter and vegetarian substitutes were offered for meat and fish (skallops—a fake version of the real thing—were a particular favorite). To aid in the emptying of the colon, Kellogg even designed a special toilet, lower than most and sloping backward, so that the civilized were forced to squat in a position that he claimed facilitated elimination.

Clean bowels and defecation several times a day were the route to health and happiness, and an in-house laboratory for fecal analysis provided a scientific check on the patients’ progress. It was a formula that attracted both the great and the not-so-good: Lincoln’s widow, Mary Todd Lincoln; Amelia Earhart; Alfred Du Pont, Henry Ford, and John D. Rockefeller; presidents Warren G. Harding and William Howard Taft; Thomas Edison and the original Tarzan, Johnny Weissmuller. In the long run, though, it was the Kelloggs’ breakfast-cereal business—originally a spin-off for those unable to afford a sanitarium visit—that made them multimillionaires.35

So great was the demand for space that, in the late 1920s, John Harvey Kellogg decided to expand again, building a fourteen-story tower block across the street from the main complex. The temptation was obvious: in the years between 1927 and 1929, Battle Creek was grossing $4,000,000 a year, and making a profit of $700,000. Kellogg took out a loan for $3,000,000 to finance what turned out to be a folly. The huge profits of earlier years had largely been frittered away, and the opening of the new facility turned out to be the harbinger of financial ruin. The stock market crash of 1929 dried up the flow of new patients. Over the next four years, the anticipated profits evaporated and, ultimately, Kellogg found himself unable to pay the interest on the loan. A last-ditch appeal to the Rockefeller Foundation in 1934 met with emphatic rejection: Alan Gregg, the director of medical sciences, noted that “J.H. Kellogg is 83 yrs old and there is no one to succeed him—it is impossible for the RF to interfere or advise in a situation like this; we should keep out.”36 Within weeks, the Sanitarium was forced to declare bankruptcy, and though it limped along in a much-reduced state for the rest of the decade, it was purchased for use as an army hospital in 1942.

Battle Creek was a special case, but a host of far smaller establishments more closely resembling the original Western Health Reform Institute were created in the closing decades of the nineteenth and early decades of the twentieth century. In many cases, physicians who had developed a nodding acquaintance with the treatment of insanity in state asylums left the depressing environment of these mausoleums of the mad and established private homes where they took in well-to-do patients not so disturbed as to require confinement in an asylum.

In the Boston suburb of Brookline, for example, Walter Channing opened his “private hospital for mental diseases” in 1879. Channing was a direct descendant of Walter Channing, an officer in the Revolutionary army, the grand-nephew of William Ellery Channing (one of the founders of Unitarianism), and eldest son of the poet William Ellery Channing—establishment credentials that must have stood him in good stead when it came to attracting patients. Unusually for someone of his background, after finishing his training at Harvard Medical School he served as an assistant physician at New York’s Asylum for the Criminally Insane, and then as first assistant physician at the Massachusetts State Insane Hospital in Danvers.37 This gave him the practical knowledge he needed to open his own establishment.

Taking fewer than twenty-five patients, mostly women, Channing provided discreet treatment in secluded surroundings. When Brookline began to be overrun by development, he relocated his enterprise in 1916 to a large wooded estate in Wellesley. Here he offered the standard remedies “nerve doctors” relied on—hydrotherapy, massage, electrotherapy using shiny static electricity machines—coupled with a soothing environment and gentle admonitions to improve one’s self-control. If not particularly efficacious, neither were these interventions particularly challenging, and genteel invalids unfit for society could idle away their days.


FAR FROM THESE MANICURED ESTATES, those shunted off to state hospitals could expect no such tolerant treatment. As the theory of degeneracy acquired a patina of respectability, some began to argue that even lifelong confinement or involuntary sterilization were half measures, the product of a failure of nerve. Attentive readers of the Anglo-American psychiatric literature would have encountered hints that a more robust solution was worthy of consideration. “Every year,” the English alienist Samuel Strahan reminded his audience, “thousands of children are born with pedigrees that would condemn puppies to the horsepond.” Lunatics were waste products of the evolutionary process, “morbid varieties fit only for excretion.”38

On occasion, the euphemism of “excretion” or “extrusion” was dispensed with entirely. Blumer had well-thumbed copies of both Maudsley’s and Clouston’s books, and proved to be an adept disciple.39 “Our modern hospitals for the insane are in some measure responsible for the increase of insanity by promoting, not the survival of the fittest, but the survival of the unfit,” he lamented, “as well as by permitting unstable persons to leave institutions and mate themselves with their kind, instead of allowing an affinity of contrasts to prevail in selecting their wives.”40 The ancient Scots may have been condemned by modern sentimentalists for their “rough and ready method” of burying alive babies and their epileptic or mentally disturbed mothers, but “from the point of view of science the cruel and remorseless Scot was more advanced than his descendants of our day.”41

Blumer’s wistful comment was not meant as a serious policy proposal. Others were not so circumspect. The prominent New York physician W. Duncan McKim, heir to a banking fortune and contemptuous of his social inferiors, warned darkly of “the ever-strengthening torrent of defective and criminal humanity.” He urged that “a gentle and painless death” was “the most humane means” of resolving the societal problem that they presented. “This should be administered not as a punishment,” he clarified, “but as an expression of enlightened pity for the victims—too defective by nature to find true happiness in life.” Euthanasia was a remedy, he acknowledged, that “is in part a very old idea, but so modified and expanded as to differ profoundly from its prototype”: “If we view broadly the evil which these individuals engender, we find not only that it thwarts the best purposes of men, but that it lies at the very root of human misery. When we reflect upon the vast amount of wealth and affection these semi-human automata absorb it would hardly seem that we are justified in preserving them merely because of an abstract sentiment for which reason can give no warrant.” Time to be rid of “many an unworthy life,” those whose existence has “proved a curse for the race and the individual.” “The idiot and the low-grade imbecile,” he hastened to reassure his readers, “are not true men, for certain essential human elements have never entered into them, and never can; nor is the moral idiot truly a man, nor, while the sad condition lasts, the lunatic. These beings live amongst us as men, but if we reckon them as human we shall fare much as if we bargained with the dead or with beasts of prey.” They should be exterminated en masse with “carbonic gas.”42 The numbers marked out for extermination would inevitably be large at first, but they would diminish with each generation, as the defective germ plasm was eliminated.

Though McKim’s discussion reads today like a twentieth-century equivalent of Jonathan Swift’s satirical Modest Proposal, it was meant in most deadly earnest. The Nation, in a review published on November 1, 1900, recommended McKim’s book to its readers and to “all good citizens interested in human progress.” Who could doubt that the public had every right to protect itself from “the ravages, one may say the compulsory and automatic ravages, of the physically, mentally, and morally diseased [?] There seems no logical objection to the absolute removal of those whose unsoundness is complete and irremediable, particularly when they are a public charge.”43

For good measure, McKim’s publisher, Scribner’s, more than a decade and a half later brought out the popular The Passing of the Great Race, by Madison Grant, director of the Bronx Zoo and president of the Immigration Restriction League. Here the proposed “obliteration of the unfit” was extended to an assault on the “inferior races.” Grant complained that the “mistaken regard for what are believed to be divine laws and a sentimental belief in the sanctity of human life tend to prevent the elimination of defective infants and the sterilization of such adults as are themselves of no value to the community.”44 Such well-known progressives as Clarence Darrow joined in the chorus, advocating efforts to “chloroform unfit children” so as to “show them the same mercy that is shown to beasts that are no longer fit to live.”45

As a practical matter, the opposition of religious groups and constraints of a democratic polity meant that the chances of actually instituting such policies were essentially nil. Charles Davenport, the Harvard-trained biologist and member of the National Academy of Sciences, did what he could as head of the Eugenics Records Office in Cold Spring Harbor, New York, to persuade the American public to put aside its concerns, but in the end he could only lament that “it seems to be against the mores to burn any considerable part of our population.”46 It would prove otherwise, of course, in Nazi Germany where, with the enthusiastic participation of German psychiatrists, Hitler’s T-4 program would see mental patients herded into gas chambers in tens, even hundreds of thousands, “useless eaters” who became the first victims of a policy of mass extermination that would soon extend to other groups defined as subhuman and a menace to the purity of the race.47 But the harsh and condemnatory language of these authors and the insistence that mental illness was an irredeemable biological condition were not without serious consequences for American social policy.


THE THEORIES OF DEGENERACY that psychiatrists had done so much to develop and propagate were seen to justify a whole series of legislative changes. The easiest to pass were attempts to prevent marriage and reproduction among the unfit, first codified into law in Connecticut in 1895, launching the fashion for laws prohibiting the marriage of the defective. If one of the marriage partners was determined to be “unfit”—feeble-minded, epileptic, or an imbecile—both parties could be imprisoned for a term of up to three years. Statutes of this sort proved irresistible to politicians, and by the mid-1930s, as many as thirty-one states prohibited the mentally ill and “feeble-minded” from marrying.48 But if such statutes were politically popular, their practical effects were slight, and even those broadly supportive of the eugenic cause came to see them as symbolic gestures, rather than effective interventions to stem the burgeoning number of defectives.

These were years of massive immigration from southern and eastern Europe. The influx of Jews and Catholics aroused powerful nativist sentiments. Critics charged that these migrants were biologically inferior to the Nordic types who were the hope of America, and threatened the nation’s future. Worse still, these were “races” with high rates of mental illness and defect and biologically based criminal propensities. The need for restrictive immigration laws to shut out the mentally defective entirely, and to limit the influx of these inferior ethnicities, was widely discussed. Francis Walker, a prominent economist and president of the Massachusetts Institute of Technology, had written in the Atlantic in 1896 of the necessity not just of “straining out from the vast throngs of foreigners arriving at our ports [the] deaf, dumb, blind, idiotic, insane, pauper, or criminal,” but also of excluding “the tumultuous access of hordes of ignorant and brutalized peasantry from the countries of eastern and southern Europe.” These were “beaten men from beaten races; representing the worst failures in the struggle for existence”—“foul and loathsome” creatures, whose presence would serve only to degrade and debase America’s culture.49

Two years earlier, the Boston bien-pensants Charles Warren, Robert DeCourcy Ward, and Prescott Hall had founded the Immigration Restriction League. The spread of eugenic ideas among “progressive” elites combined with hostility to competition from the new migrants among the working classes ensured that the agitation to exclude those considered biologically inferior increased in the decades that followed. Ultimately it met with legislative success, a process helped along by the expert testimony of sympathetic psychiatrists and social scientists. Asian immigrants were barred in 1917, and the Immigration Act of 1924 imposed strict quotas on immigration from southern and eastern Europe, inaugurating a system that would remain in place for more than four decades.

This was not the first time questions of race and ethnicity had aroused nativist sentiment. Between 1847 and 1854, the Irish potato famine prompted a mass immigration of Catholics to a still largely Protestant country. Two-fifths of the nearly two and three-quarter million immigrants in these years were Irish Catholics, and disproportionate numbers of them soon began to show up in public mental hospitals. This pattern was particularly marked in the major cities: in Boston, New York, Philadelphia, Cincinnati, and St. Louis, where unskilled Irish immigrants clustered in urban slums. Sharing the larger culture’s distaste for the new arrivals, asylum superintendents found them “more noisy, destructive, and troublesome,” “very ignorant, uncultivated people [who] from some cause or another, seldom recover.”50 There was talk of establishing separate institutions for them, and a frequent resort to segregating them into separate wards from those that housed the native-born.51

If the prospect of mingling Irish and other foreign-born patients with the native-born was problematic, mixing white and Black lunatics was unthinkable. In the pre–Civil War North, the African American population was small, those who applied were often turned away, and the few who were admitted to asylums were simply placed in segregated cells. In the South, slaves were largely excluded from the asylums, except where their owners were willing to pay for them, in which case they were separately provided for in grossly inferior accommodation. After 1865, either completely separate provision was made in existing asylums or separate and distinctly unequal “colored” asylums were established in which to confine them. Tennessee (1866), Virginia (1869), North Carolina (1880), Mississippi (1889), and Alabama (1902) opened segregated institutions. Segregated wards were set up in various states, including West Virginia (1864), Missouri (1865), Georgia (1866), and Arkansas (1882).52 North or South, there was a broad consensus: racially integrated wards were inconceivable, and “virtually no member of the psychiatric profession was willing to challenge the dominant separate but equal doctrine that was the basis of public policy.”53

Just how stark the differences in the treatment of Black Americans were has been detailed in a number of recent analyses of asylums in the South—in Georgia, Louisiana, Virginia, and Washington, DC. Collectively, they show the profound differences in the way African American patients were viewed and treated both by psychiatrists and the staff of these institutions. Martin Summers has documented the large discrepancies in the use of restraints and seclusion between Black and white inmates at the federal hospital in Washington, DC.54 He and Elodie Edwards-Grossi have also shown the equally profound differences in the labor demanded of Black inmates compared with their white counterparts. If most American asylums were massively underfunded and overcrowded, conditions in the “colored” wards and institutions were far worse. Some Black patients sought to resist what they perceived as slave labor.55

The heads of mental hospitals in the South were unapologetic about the differential treatment of the races, insisting that it was “essential” and beneficial to both races. Under slavery, T. O. Powell proclaimed, “there were, comparatively speaking, few negro lunatics. Following their sudden emancipation their number of insane began to multiply, and, as accumulating statistics show, the number is now alarmingly large and on the increase.”56 He and his colleagues knew why: as primitive people barely removed from savagery, they were simply not equipped to cope with the pressures of a free society.57 In reality, slaveholders had been reluctant to assume the cost of institutionalizing the mentally ill; some even resorted to “freeing” them rather than providing for them.58

The “necessity” of segregating Black patients was adopted as an article of faith by the first generation of asylum physicians, and it exercised a continuing hold on the profession.59 In 1914, Mary O’Malley, a junior psychiatrist on the staff of St. Elizabeths in Washington, DC, explained the origins of “psychoses in the colored race” in the pages of the profession’s journal, the American Journal of Insanity: “300 years ago the negro ancestors of this race were naked dwellers on the west coast of Africa in the depths of savagery and suddenly transplanted to an environment of the highest civilization, and 250 years later had all the responsibilities of this higher race thrust upon them.”60 Small wonder that so many of them became psychotic.

Seven years later, her colleague W. M. Bevis echoed these views. “The colored race,” he contended, were simply too primitive to cope with the stresses of a free society. They were, after all, the descendants of “savages or cannibals in the jungles of central Africa.” Consequently, their “biological development” left them “poorly prepared” for emancipation, though “their talent for mimicry is remarkable, sometimes sufficiently exact to delude the uninitiated into the belief that the mental level of the negro is only slightly inferior to that of the Caucasian.” Nothing, of course, could be further from the truth.61 Persuaded that a happy-go-lucky nature and primitive emotional equipment meant that few Black patients suffered from depression, psychiatrists believed they disproportionately exhibited the symptoms of dementia praecox and mania.

The early twentieth-century anti-immigrant sentiment thus built on decades of prejudice and unequal treatment that were most deeply embedded in “scientific” beliefs about the biological inadequacies of African Americans. If the Grants, the McKims, and the Davenports directed their ire at Jews, Italians, Greeks, Poles, and Russians, that did not mean that they were not even more convinced of the inferiority of Black Americans. It was simply that those prejudices were so deeply rooted and widely shared in the dominant culture as to be not worth mentioning. Segregation and neglect were seemingly immutable facts of life throughout the asylum era. For the most part, psychiatrists shared the profound racial prejudices of the larger society and even helped give a scientific gloss to those beliefs. Those who did not share the broad consensus were largely complaisant, reluctant to make waves or simply oblivious to the racial inequities of the mental health system.


ASYLUM DOCTORS and their allies had by the early decades of the twentieth century put in place yet another set of policies that derived from their belief in the heritability of mental illness. Sexual surgery to “cure” female patients of their insanity had enjoyed something of a vogue in the last three decades of the nineteenth century. By and large, the operations had been performed on “nervous” cases outside asylum walls, mostly by those associated with the newly emerging field of gynecology. Pioneered by a Georgia physician, the aptly named Robert Battey, the “normal ovariotomy”—the surgical removal of apparently healthy ovaries for the relief of mental distress—was performed some thousands of times all across the United States, and on a smaller scale in Europe.62

Asylum superintendents, resentful of the encroachments of another group of medical men on their turf and perhaps worried that embarking on a novel, potentially controversial, and highly experimental therapeutics would be perilous, for the most part shunned the operation. By the turn of the century, however, a handful of asylums in the United States and Canada had dispensed with these scruples and embarked on ovariotomies—on occasion (as at Norristown State Hospital in Pennsylvania) through the efforts of female physicians in charge of patients of their own sex.63

Their interventions were short-lived. There had been much in nineteenth-century medicine’s views of “the female animal” that lent seeming substance to the claim that an operation on a woman’s reproductive organs might influence her mental state for the better, but by the 1890s, as elite physicians began for the first time to gain some understanding of the role of the endocrine system in the regulation of bodily functions, a growing disquiet emerged about the long-term impact of such drastic and mutilating surgery.64 The use of normal ovariotomies on asylum inmates was soon halted, and experiments with castration, attempted in some homes for the feeble-minded and institutions for juvenile delinquents, were met with fierce criticism, in at least one case costing the physician who had performed them his job.65

But less drastic surgeries—severing the vas deferens of the male and tying the fallopian tubes of the female—were now available (the first vasectomy was performed in 1897), and these spared the testes and ovaries, preserving their role in the body’s hormonal system. In short order, the availability of these alternatives led states to adopt statutes permitting involuntary sterilization of the unfit. Indiana passed the first such law in 1907, after several failed attempts to pass similar legislation in New York and Pennsylvania, and soon many other states followed suit. Though seven of the first sixteen laws were subsequently struck down by the courts, others passed constitutional muster and were enthusiastically implemented, nowhere more so than in the progressive state of California.66

Legislation permitting such operations was passed in Sacramento in 1909. Its eugenic aims were made explicit in a 1917 amendment, which spoke of the need to prevent reproduction by those with “mental disease which may have been inherited and is likely to be transmitted to descendants.”67 By 1921, California had performed more than 2,000 sterilizations, 80 percent of all such operations in the country, and thousands more were performed over ensuing decades—11,491 by 1950, out of a total for the United States of some 23,466.68 Margaret Smyth, the superintendent of California’s Stockton State Hospital, was a particular enthusiast. She noted with pride in 1938 that California’s aggressive moves to prevent the reproduction of the unfit had served as a salutary example and been emulated in Nazi Germany:

California adopted its first sterilization law April 26, 1909. This law has attracted attention from countries all over the world. The German government applies sterilization systematically in accordance with its law, the total number of operations to date having reached something like 250,000. Investigators agree that the policy there is being enforced in a scientific spirit without racial or political implications and with a minimum of difficulty. The leaders in the German sterilization movement state repeatedly that their legislation was formulated only after careful study of the California experiment. It would have been impossible, they say, to undertake such a venture involving one million people, without drawing heavily upon previous experience elsewhere.69

A decade before Smyth spoke so proudly of her state’s place in the vanguard of psychiatric progress, a test case of the constitutionality of involuntarily sterilizing the mentally defective had reached the United States Supreme Court. Carrie Buck was a twenty-one-year-old woman who had been a resident of the Virginia Colony for the Epileptic and Feeble-minded in Lynchburg for three years before, on the morning of October 19, 1927, she was wheeled into the operating theater to have her fallopian tubes severed, cauterized, and tied. She had been pregnant on admission, apparently the aftermath of being raped by her cousin, in whose parents’ house she had lived after her mother had been institutionalized as feeble-minded. Carrie’s out-of-wedlock pregnancy had been the primary reason for her own subsequent confinement in the same Colony as her mother. (Her baby, incidentally, was “disposed of,” as the Colony’s official records put it, by being handed over to her aunt and uncle, who took their grandchild with the understanding that the little girl “will be committed later on if it is found to be feeble-minded also.”)70

Buck was unaware of why she was being operated on, just as she did not know that, months previously, after Virginia had passed a statute permitting involuntary sterilization, she had been carefully selected by the law’s defenders to serve as the test case of its constitutionality. The state had provided her with a lawyer who proceeded to sue the Colony’s superintendent, Dr. John Bell, for operating on her. The original trial was little more than a formality. Buck’s lawyer made little effort to challenge the factual basis of the state’s version of the case, and it passed quickly through Virginia’s state courts before being appealed to the United States Supreme Court. Here the lawyer did finally speak eloquently about the underlying constitutional issue, whether the 14th Amendment provided a guarantee of the individual’s right to bodily integrity. But as the measure’s backers had hoped, the Justices were unmoved. The majority ruled 8–1 that there was no constitutional obstacle to the involuntary sterilization of an American citizen. Oliver Wendell Holmes, Jr., widely regarded as one of the most eminent jurists in the nation’s history, was assigned the task of writing the opinion. He ringingly endorsed the state’s position: “It is better for all the world,” he wrote, “if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind. The principle that sustains compulsory vaccination is broad enough to cover cutting Fallopian tubes. Three generations of imbeciles are enough.”71 In a rich bit of historical irony, the parents of the politician who wrote the Virginia statute, and who went on to argue successfully for its constitutionality before the United States Supreme Court, had both died insane in Virginia asylums. Circumstantial evidence strongly suggests that they were driven mad by what was then seen as the quintessential disease of the degenerate classes, tertiary syphilis.

Holmes’s decision had a broad impact. Indiana, which had passed the first law allowing for involuntary sterilization two decades earlier, had seen that statute invalidated in the courts. It now became the first state to enact a new statute crafted to pass constitutional muster by employing the reasoning behind the Buck v. Bell decision. North Dakota soon followed suit, as did Mississippi, and by 1931, four years after the Supreme Court ruling, twenty-eight states had enacted legislation that would pass the test established in Buck v. Bell. That same brief period saw as many involuntary sterilizations undertaken in the United States as the total number performed in the preceding two decades.72

A year later, on July 3, 1932, Buck’s daughter, Vivian, the third-generation “imbecile” Holmes had denounced, died at eight of complications from measles—but not before she had been placed on the academic honor roll at the elementary school she attended in Charlottesville.