5
BRAIN DAMAGE AS MIRACLE THERAPY
It has been said that if we don’t think correctly, it is because we haven’t “brains enough.” Maybe it will be shown that a mentally ill patient can think more clearly and constructively with less brain in actual operation.
—Walter Freeman, 19411
 
 
 
 
INSULIN COMA, METRAZOL, and electroshock had all appeared in asylum medicine within the space of a few years, and they all “worked” in a similar manner. They all dimmed brain function. Yet they were crude methods for achieving this effect. With these three methods, there was no precise control over the region of the brain that was disabled, nor was there control over the degree to which the brain was traumatized. The approach, said one physician, seemed akin to “trying to right a watch with a hammer.”2 However, one other therapy introduced into asylums during this same period was not so imprecise, and it was this last therapy that, in the 1940s, became psychiatry’s crowning achievement. Newspapers and magazines wrote glowing articles about this “miracle” of modern medicine, and, in 1949, fourteen years after its introduction, its inventor, Portuguese neurologist Egas Moniz, was awarded a Nobel Prize.
That therapy, of course, was prefrontal lobotomy.

Inspiration . . . Or a Clear Warning?

The frontal lobes, which are surgically disabled during prefrontal lobotomy, are the most distinguishing feature of the human brain. Put an ape brain and a Homo sapiens brain side by side, and one difference immediately jumps out—the frontal lobes in the human brain are much more pronounced. This distinguishing anatomy, so visible at autopsy, led philosophers as far back as the Greeks to speculate that the frontal lobes were the center for higher forms of human intelligence. In 1861, long before Moniz picked up his drill, the great French neurologist Pierre Paul Broca pointed to the frontal lobes as the brain region that gives humankind its most noble powers:
The majesty of the human is owing to the superior faculties which do not exist or are very rudimentary in all other animals; judgment, comparison, reflection, invention and above all the faculty of abstraction, exist in man only. The whole of these higher faculties constitute the intellect, or properly called, understanding, and it is this part of the cerebral functions that we place in the anterior lobes of the brain.3
Scientific investigations into frontal-lobe function had been jump-started a few years earlier by the remarkable case of Phineas Gage. Gage, a twenty-five-year-old Vermont railroad worker, was preparing a hole for blasting powder when an explosion drove a 3.5-foot iron rod into his left cheek and through his frontal lobes. Incredibly, he survived the accident and lived another twelve years. But the injury dramatically changed him. Before, others had admired him as energetic, shrewd, and persistent. He was said to have a well-balanced mind. After his accident, he became ill mannered, stubborn, and rude. He couldn’t carry out any plans. He seemed to have the mind of a spoiled child. He had changed so radically that his friends concluded that he was “no longer Gage.”
Over the next eighty years, animal research revealed similar insights about the importance of the frontal lobes. In 1871, England’s David Ferrier reported that destroying this brain region in monkeys and apes markedly reduced their intelligence. The animals, selected for their “intelligent character,” became “apathetic or dull or dozed off to sleep, responding only to the sensations or impressions of the moment.”4 Their listlessness was periodically interrupted by purposeless wanderings. Italian neurologist Leonardo Bianchi, who conducted lobotomy experiments in dogs, foxes, and monkeys, concluded in 1922 that the human intelligence responsible for creating civilization could be found in the frontal lobes.
In the 1930s, Carlyle Jacobsen at Yale University conducted studies with two chimps, Becky and Lucy, that highlighted the importance of the frontal lobes for problem solving. He tested this skill through a simple experiment. Each chimp would be placed into a chamber and allowed to watch while food was placed beneath one of two cups. A blind would be lowered, hiding the cups from view, and then, five minutes later, the blind would be raised and the chimp would be given an opportunity to get the food by picking the right cup. After their frontal lobes were removed, Becky and Lucy lost their ability to solve this simple test. The frontal lobes, Jacobsen concluded, were responsible for an organism’s adjustment to its environment. This region of the brain synthesized information, including memories formed from recent events, and it was this process that produced intelligent action.5
By this time, numerous clinical reports had also documented the effects of severe head wounds. After World War I, Gage’s story was no longer such an anomaly. Clinicians reported that people with frontal-lobe injuries became childish and apathetic, lost their capacity to plan ahead, and could not make sound judgments. Similarly, cancer patients who had frontal-lobe operations because of brain tumors were said to act in puerile ways, to lack initiative and will, and to display emotions that seemed flattened or out of sync with events. Frontal-lobe injuries led to a recognizable syndrome, dubbed “Witzelsucht,” that was characterized by childish behavior.
None of this intellectual loss and behavioral deterioration following frontal-lobe injury was surprising. If anything, physicians voiced surprise that the intellectual deficits weren’t greater. It was remarkable that Gage, who’d had a rod go completely through the front of his brain, could function as well as he did. Ferrier had noted that the extent of the intellectual deficit in his lobotomized monkeys was not immediately evident, but rather became apparent only after some time. People with frontal-lobe injuries were even found to do fairly well on standardized intelligence tests.
Indeed, frontal-lobe injury appeared to produce an odd mixture. The pronounced emotional and problem-solving deficits were accompanied by the retention of a certain mechanical intelligence. Such was the case with Joe A., a New York City stockbroker who developed a brain tumor at age thirty-nine. After Johns Hopkins neurosurgeon Walter Dandy removed the tumor in an operation that caused extensive damage in the prefrontal region of Joe’s brain, Joe became a profoundly different person. In some ways, he functioned remarkably well. He could still play checkers, his memory seemed unimpaired, and he understood what had happened to him. At times, he could socialize well. On one occasion, a group of visiting neurologists spent an hour with him and failed to notice anything unusual. But like Gage, Joe was a changed person. He couldn’t focus his attention any more, he lacked motivation to go back to work, he couldn’t plan daily activities, and he often behaved in emotionally inappropriate ways. He was easily irritated, constantly frustrated, spoke harshly of others, and became a hopeless braggart. He would see boys playing baseball and blurt out that he would soon become a professional ballplayer, as he was a better hitter than anyone. On IQ tests he now scored below ninety, and he could do well only with familiar material. His capacity to learn had disappeared.
Together, the animal studies and clinical reports of head injuries seemingly pointed to a stark conclusion: Destroying tissue in this brain region would cause many intellectual and emotional deficits. The person would likely become more apathetic, lack the ability to plan ahead, be unable to solve problems, and behave in puerile, emotionally inappropriate ways. Witzelsucht was not a kind fate. Yet in 1935, Portuguese neurologist Egas Moniz saw something encouraging in these reports. He found reason to believe that inflicting injury on the frontal lobes could prove beneficial to the mentally ill.

Planting the Seed

The idea of drilling holes into the brains of the mentally ill to cure them was not, in the 1930s, new to psychiatry. As far back as the twelfth century, surgeons had reasoned that trepanning, which involved cutting holes in the skull, allowed demons to escape from a poor lunatic’s brain. In 1888, Gottlieb Burckhardt, director of an asylum in Préfargier, Switzerland, had removed part of his patients’ cerebral cortex to quiet their hallucinations. “If we could remove these exciting impulses from the brain mechanism,” he wrote, “the patient might be transformed from a disturbed to a quiet dement.”6 Although one of his six patients died, Burckhardt concluded that it did make the others more peaceful. Twenty years later, a Russian surgeon, Ludwig Puusepp, tried to cure three depressed patients by cutting into their frontal lobes. But when he didn’t find it particularly helpful, the notion was pushed to the background of psychiatric research.
Moniz resurrected it at a very telling time in his career. In 1935, Moniz was sixty-one years old. He’d led a colorful, prosperous life, but he had never realized his grandest dreams. As a young man, newly graduated from medical school, he’d thrown himself into political struggles to replace Portugal’s monarchy with a democratic government, a struggle that twice landed him in jail. After a new government was established in 1910, he was elected to the Portuguese Parliament and served as ambassador to Spain. Wherever he went, he lived the good life; the parties that he and his wife gave were known for their elegance, style, and good food. But in 1926, it all came tumbling down when the Portuguese government was overthrown in a military coup. Disappointed, even bitter, over the loss of his beloved democracy, he turned his attention full time to medicine and his neurology practice. He’d long juggled his life as an academic physician, on the faculty at the University of Lisbon, with his life in politics, and he now set his sights on making a lasting contribution to medicine. “I was always dominated by the desire to accomplish something new in the scientific world,” he recalled in his memoirs. “Persistence, which depends more on willpower than intelligence, can overcome difficulties which seem at first unconquerable.”7
Moniz quickly found himself on the verge of the fame he so avidly sought. In 1928, he was nominated for the Nobel Prize in medicine for inventing a technique for taking X-rays of cerebral arteries. He didn’t win, though, and he found himself obsessed with the prize. Over the next few years, he actively campaigned to be renominated for the award, at times wielding his pen to disparage others working on similar blood-imaging techniques, fearful that their achievements might diminish his own. Although he was nominated for the Nobel Prize again in 1933, once more the award went to another scientist, and it seemed certain now that the top honor would never be his. That is, until he traveled in August 1935 to London to attend the Second International Congress in Neurology.
That year, the conference featured an all-day symposium on the frontal lobes. A number of speakers presented their latest research on this region of the brain. American neurologist Richard Brickner provided an update on Joe A., his tumor patient. Jacobsen detailed his experiments with the chimps Lucy and Becky. Although fascinating, the reports led to a sobering conclusion. “There is little doubt,” wrote George Washington University neurologist Walter Freeman, “but that the audience was impressed by the seriously harmful effects of injury to the frontal lobes and came away from the symposium reinforced in their idea that here was the seat of the personality and that any damage to the frontal lobes would inevitably be followed by grave repercussions upon the whole personality.”8
Moniz, however, plucked from the presentations a different message. The reports by Jacobsen and Brickner had set his mind churning. Jacobsen, after detailing the cognitive deficits in the chimps following lobotomy, had noted that the surgery also produced a marked emotional change in one of them, Becky. Before the surgery, she had typically reacted angrily when she failed to pick the right cup in the food experiment. She would roll on the floor, defecate, or fly into a rage. But after the surgery, nothing seemed to bother her. If she failed to solve a problem, she would no longer throw an emotional tantrum. It was as though she had joined a “happiness cult” or placed her “burdens on the Lord,” Jacobsen said.9 Brickner’s account of Joe A. had made an even deeper impression on Moniz. Although Joe may have changed after his frontal lobes were damaged, apparently he could still be sociable and converse in a relatively normal way. All of which set Moniz to thinking: Could the same be said of time spent with the mad, the emotionally distressed? Who wouldn’t immediately notice their illness? Joe A., Moniz figured, functioned at a much higher level than those ill with schizophrenia or severe depression. What if he deliberately injured both frontal lobes of the mentally ill in order to cure them? After all, Joe A. could “still understand simple elements of intellectual material,” he reasoned. “Even after the extirpation of the two frontal lobes, there remains a psychic life which, although deficient, is nevertheless appreciably better than that of the majority of the insane.”10
Moniz, who prided himself on being a man of science, quickly came up with a neurological explanation for why such surgery would cure the mentally ill. Thoughts and ideas, he reasoned, were stored in groups of connected cells in the brain. Schizophrenia and emotional disorders resulted from pathological thoughts becoming “fixed” in such “celluloconnective systems,” particularly in the frontal lobes. “In accordance with the theory we have just developed,” he said, “one conclusion is derived: to cure these patients we must destroy the more or less fixed arrangements of cellular connections that exist in the brain.”11
Three months after returning from London, Moniz chose a sixty-three-year-old woman from a local asylum to be his first patient. He knew his reputation was at stake. Should the operation fail, he would be condemned for his “audacity.” The woman, a former prostitute, was paranoid, heard voices, and suffered bouts of crippling anxiety. Moniz’s assistant, Almeida Lima, performed the surgery: He drilled holes into her skull, used a syringe to squirt absolute alcohol onto the exposed white fibers, which killed the tissue through dehydration, and then sewed her back up. The whole operation took about thirty minutes. Just hours later, she was able to respond to simple questions, and within a couple of days, she was returned to the asylum. A young psychiatrist there soon reported that the woman had remained calm, with her “conscience, intelligence, and behavior intact,” leading Moniz—who’d hardly seen her after the operation—to happily pronounce her “cured.”12
Within three months, Moniz and Lima had operated on twenty mentally ill patients. During this initial round of experimentation, they continually increased the scope of brain damage. The greater the damage, it appeared, the better the results. More holes were drilled, more nerve fibers destroyed. Starting with the eighth patient, Lima began using a thin picklike instrument with a wire loop, called a leucotome, to cut the nerve fibers in the frontal lobes. Each cutting of nerve tissue was counted as a single “coring”; by the twentieth patient, Lima was taking six such corings from each side of the brain. They also concluded that while the surgery didn’t appear to help schizophrenics, it did reliably make those ill with manic depression less emotional. That was all the change that Moniz needed to see. In the spring of 1936, he announced his stunning success: Seven of the twenty patients had been cured. Seven others had significantly improved. The other six were unchanged. “The intervention is harmless,” Moniz concluded. “None of the patients became worse after the operation.”13
Moniz had achieved the triumph he’d long sought. All his fears could now be put to rest. He was certain that his surgery marked “a great step forward.” Within a short period, he churned out a 248-page monograph, Tentatives opératoires dans le traitement de certaines psychoses, and published his results in eleven medical journals in six countries.e Reviewers in several countries found his lengthy monograph impressive, and none was more enthusiastic than an American, Walter Freeman. Writing in the Archives of Neurology, he suggested that, if anything, Moniz had been too “conservative” in his declarations of success. From Freeman’s perspective, Moniz’s count of seven cures and seven improvements understated the “striking” results the surgery had apparently produced.14

Surgery of the Soul

Like Moniz, Walter Freeman was a prominent physician driven by ambition. By 1935, he had an accomplished résumé. Only forty years old, he was a faculty member at both Georgetown and George Washington University medical schools, the author of a well-received text on neuropathology, and head of the American Medical Association’s certification board for neurology and psychiatry, a position that recognized him as one of the leading neurologists in the country. Yet for all that, he could point to no singular achievement. He’d analyzed more than 1,400 brains of the mentally ill at autopsy, intent on uncovering anatomical differences that would explain madness, but had found nothing. This research had proven so barren that Freeman sardonically quipped that whenever he encountered a “normal” brain, he was inclined to make a diagnosis of psychosis. He also was a bit of an odd bird. Brilliant, flamboyant, acerbic, cocky—he wore a goatee and seemed to enjoy prickling the sensibilities of his staid colleagues. He taught his classes with a theatrical flair, mesmerizing his students, in particular with in-class autopsies. Freeman would remove a corpse’s skullcap with a saw and then triumphantly remove the brain, holding it up to teach neuroanatomy.15
Moniz’s surgery had a natural allure for him—it was bold, daring, and certain to ruffle a few professional feathers. It also fit into his own thinking about possible remedies for the mentally ill. Even before Moniz had published his results, he’d suggested, in a paper titled “The Mind and the Body,” that brain surgery could find a place in psychiatry’s toolbox. Illnesses like encephalitis and syphilis attacked distinct regions in the brain, he’d noted, and those diseases caused alterations in behavior. If a viral agent could change a person’s actions, couldn’t a neurosurgeon do the same with his knife? “We may be able to influence behavior in a significant manner by destroying localized portions” of the brain, he’d concluded.16
Freeman recruited a young neurosurgeon, James Watts, to be his collaborator. Their first patient was, like Moniz’s, a sixty-three-year-old woman, A. H. She suffered from severe depression, was suicidal, and obsessed about growing old. Freeman described her as a “master at bitching” who so domineered her husband that he led “a dog’s life.” Although her family consented to the experiment, she protested that she didn’t want any part of it if it would require cutting her hair. Freeman mollified her by assuring her that her precious curls would not be shorn, and on September 14, 1936, he and Watts cut six corings from each of her frontal lobes. The operation went smoothly, and after awaking from anesthesia, A. H. reported that she felt better and that she was no longer sad. She expressed no concern that Freeman had lied to her and that her hair was now gone.17
Freeman and Watts wasted no time in announcing their positive results. Before two months had passed, they’d fired off an article to the Southern Medical Journal, claiming success. A. H., they said, was now “content to grow old gracefully,” was able to manage household chores “as well as she ever did,” and enjoyed “the company of her friends who formerly used to exhaust her.” Her husband found her “more normal than she had ever been.” By the end of the year, Freeman and Watts had operated on sixteen more women and three men. Their published conclusions remained upbeat. Not only did the operation relieve emotional distress, but any intellectual loss was apparently minimal. Memory was described as intact, concentration improved, and judgment and insight undiminished. The patients’ ability to enjoy external events had increased. The one negative, Freeman and Watts wrote, was that “every patient probably loses something by this operation, some spontaneity, some sparkle, some flavor of the personality, if it may be so described.” But that loss seemed acceptable in patients who “have an otherwise hopeless prognosis,” they said.18
Freeman proved even better than Moniz at publicizing his and Watts’s surgical triumph. Just before he presented the results of their first six surgeries at a meeting of the Southern Medical Association on November 18, 1936, he called a Washington Star reporter, Thomas Henry, and gave him an “exclusive.” That stirred other reporters into a near frenzy, just as Freeman had hoped. The New York Times wrote that their “new operation marked a turning point of procedure in treating mental cases,” their work likely to “go down in medical history as another shining example of therapeutic courage.” Time, Newsweek, and other national publications trumpeted their accomplishments as well, and Freeman, enjoying this blush of fame, gradually made ever more startling claims. His new “surgery of the soul,” the New York Times reported, in a June 7, 1937, article that appeared on its front page, could relieve “tension, apprehension, anxiety, depression, insomnia, suicidal ideas, delusions, hallucinations, crying spells, melancholia, obsessions, panic states, disorientation, psychalgesia (pain of psychic origin), nervous indigestion and hysterical paralysis.” The operation, the paper added, “transforms wild animals into gentle creatures in the course of a few hours.”19
This was astounding stuff. People from around the country sent letters to Freeman and Watts asking about this amazing new operation. If worry, depression, and anxiety could be plucked neatly from the brain, there was no telling what other conditions could be miraculously treated with their amazing leucotomes. Perhaps asthma could be removed from the brain. Or mental retardation? Their very souls apparently could be carved for the better. After the first round of twenty surgeries, Freeman and Watts also altered the operation so that the frontal lobes would be disabled in a more “precise” way.f Instead of drilling into the skull from the top, they cut into the brain from the lateral sides, varying the scope of frontal-lobe damage depending on the patient’s diagnosis. For those suffering from emotional disorders, they would make their cuts toward the front of the skull. For those with chronic schizophrenia, they would drill into the frontal lobes farther back. The more posterior the entry point, the larger the portion of the frontal lobes that would, in essence, be disconnected from the rest of the brain.
The human mind, it seemed, could be neatly fixed—and even improved—by the surgeon’s knife. As Freeman proudly wrote, lobotomy “was a stroke at the fundamental aspect of the personality, that part that was responsible for much of the misery that afflicts man.”21

The Stamp of Approval

Although the positive results announced by Freeman and Watts created a great stir in psychiatry and in the press, neurosurgeons as a group did not rush to perform the operation. This surgery was clearly a profound one, which gave most physicians great pause. Insulin coma, Metrazol, and electroshock may have worked by inflicting trauma on the brain, but there was still much debate over how severe that trauma was or whether it led to permanent damage. With lobotomy, it was clear: This was an operation that permanently destroyed a part of the brain thought to be the center of human intelligence. Did one really dare to do that? With that question hanging in the air, fewer than 300 lobotomies were performed in the United States from 1936 to 1942. But gradually over that period wariness about the operation waned, and it did so for an understandable reason. Nearly all those who tried the operation concluded that it worked wonders.
After Freeman and Watts, the first American neurosurgeon to try lobotomy was James Lyerly, in Jacksonville, Florida. By early 1938, he had performed the surgery on twenty-one patients. Most of those he chose for the operation suffered from depression and other emotional disorders, and many had been ill less than a year. He reported spectacular results. Patients who had been painfully worried and anxious had become relaxed and cheerful and were able to laugh once more. They’d gained weight, their “radiant” faces reflecting their new inner happiness. Nor did it appear that such transformation had come at any great cost. In none of the patients, Lyerly wrote, was there any evidence that disconnecting the frontal lobes had affected “the patient’s judgment, reasoning, or concentration, or his ability to do arithmetic.” They could now “think better and do more work than before.” All of the hospitalized patients had either been discharged, or would be soon.22
Lyerly presented his results at a meeting of the Florida Medical Association in May 1938, and his report convinced his peers that they, too, needed to start performing the surgery. J. C. Davis, president of the Florida State Board of Medical Examiners, called the outcomes “nothing less than miraculous.” Other psychiatrists joined in to praise Lyerly, concluding that the value of such an operation, for patients who otherwise had no hope, “cannot be overrated.” All psychiatrists now had an obligation, reasoned P. L. Dodge, to bring this operation “before the rest of the world for the benefit of every patient who suffers from this disease so they might avail themselves of this particular operation.” Dodge promised to immediately write the families of his patients and urge them to have their loved ones undergo lobotomy as soon as possible, before they became hopelessly deteriorated.23
Other physicians soon reported similar results. Francis Grant, chief of neurosurgery at the University of Pennsylvania, and a close friend of Watts, operated on ten patients at Delaware State Hospital. Seven, he said, had returned home after the surgery. Two of his anecdotal accounts told of remarkable revivals. Prior to the surgery, Sally Gold had been “entirely hopeless.” A year later, she was engaged and had invited Grant to attend the wedding. Julia Koppendorf’s story was much the same. Before undergoing a lobotomy, she had been so engulfed in depression that her life was “little worth living,” Grant said. Twelve months later, her nephew reported that she was now quite normal.24
Patients, too, were quoted as singing the praises of the surgery. They were said to write letters of gratitude, detailing their newfound happiness and how their lives had been born anew. Watts received a touching poem from one of his patients.
Gentle, clever your surgeon’s hands
God marks for you many golden bands
They cut so sure they serve so well
They save our souls from Eternal Hell
An artist’s hands, a musician’s too
Give us beauty of color and tune so true
But yours are far the most beautiful to me
They saved my mind and set my spirit free.25
Pennsylvania Hospital’s Edward Strecker found that the surgery even benefited schizophrenics. Both Moniz and Freeman had determined that it didn’t help this group of patients—although they became less emotional, their delusions didn’t subside—but Strecker found otherwise. His chronic patients had been miraculously reborn. “Disregard of others,” Strecker wrote, “has been replaced by manifestations of thoughtfulness, consideration, and generosity.” Artistic and athletic skills were said to be revived. Whereas before the schizophrenic patients had been lost to the world, they now happily thought about the future, eagerly anticipating going on trips, taking cruises, and going to the theater. They scorned the voices that had once tormented them as “silly” and unworthy of heeding.26
As had been the case with other published reports, Strecker’s anecdotal accounts gripped the imagination. Strecker told of one previously lost soul—in the hospital, she had mutilated herself, wouldn’t wear clothes, and had not responded to any other therapies—who had turned into a Good Samaritan hero. While on an outing, she rescued a friend who had been thrown from a horse—applying first aid, stopping a car to get help, accompanying her friend to the hospital, and waiting there until she was out of danger. The disconnection of her frontal lobes had apparently made her both resourceful and compassionate. Another of Strecker’s lobotomized patients, a twenty-five-year-old woman, had become the mother of a beautiful baby, was working as a hostess at a resort, and played golf so well that she could compete in highly competitive tournaments. Perhaps most impressive, she had “retained all of her intellectual capacity.”
In 1943, Lloyd Ziegler tallied the lobotomy results to date. By that time, there had been 618 lobotomies performed at eighteen different sites in the United States and Canada. Five hundred and eighteen patients were “improved” or “recovered”; 251 were living in the community and working full or part-time. Twelve people had died from the operation. Only eight had worsened following the surgery. “We have known for a long time that man may get on with one lung or one kidney, or part of the liver,” Ziegler concluded. “Perhaps he may get on, and somewhat differently, with fewer frontal fiber tracts in the brain.”27
The surgery had passed the test of science. There could no longer be any doubt that the operation greatly benefited the seriously mentally ill.

The Untold Story

Even today, the published study results are stunning to read. The anecdotal reports of lives restored—of hand-wringing, suicidal people leaving hospitals and resuming lives graced by jobs and marriage—are particularly compelling. As they came from physicians with the best credentials, one begins to wonder whether history has been unfairly harsh on lobotomy. We remember it as a mutilating surgery, but perhaps that isn’t so. Perhaps it was a worthwhile operation, one that should be revived.
Either that, or there was something missing from the clinical reports.
A fuller view of the effects of lobotomy can be found today, and ironically, it comes from Freeman and Watts. In their 1950 book Psychosurgery, they detailed their experiences during more than ten years of performing lobotomies, and as might be expected, they had long-term good news to report. The operation had helped more than 80 percent of the 623 patients they had operated on. Yet it is in this book, which was meant to present lobotomy in a favorable light, that a clear historical picture emerges of just how the surgery transformed the mentally ill. As part of their discussion, Freeman and Watts told families what to expect from patients recovering from lobotomies. Their candid advice, designed to keep families’ expectations in check, tells an entirely different story than that depicted in the medical literature.
People who underwent a lobotomy went through various stages of change. In the first weeks following the operation, Freeman and Watts wrote, patients were often incontinent and displayed little interest in stirring from their beds. They would lie in their beds like “wax dummies,” so motionless that nurses would have to turn them to keep them from getting bedsores. Relatives would not know what to make of their profound indifference to everything around them, Freeman and Watts said: “[The patient] responds only when they keep after him and then only in grunts; he shows neither distress nor relief nor interest. His mind is a blank . . . we have, then a patient who is completely out of touch with his environment and to whom the passage of time means nothing.”28
To stir patients, physicians and nurses would need to tickle them, pound on their chests, or grab them by the neck and “playfully throttle” them. When finally prodded to move, patients could be expected to behave in unusual ways. One well-bred lady defecated into a wastebasket, thinking it was a toilet. Patients would “vomit into their soup plates and start eating out of the plate again before the nurse [could] take it away.” They would also lose any sense of shame. Patients who were stepping out of the shower or were on the toilet would not be the least bit embarrassed when doctors and nurses came into the bathroom.
In this newly lethargic, shameless state, patients who once had been disruptive to the wards now caused fewer problems. Even patients who had been violent before the operation were likely to behave in quieter ways, Freeman and Watts said.
We vividly recall a Negress of gigantic proportions who for years was confined to a strong room at St. Elizabeths Hospital. When it came time to transfer her to the Medical Surgical Building for operation five attendants were required to restrain her while the nurse gave her the hypodermic. The operation was successful in that there were no further outbreaks . . . from the day after operation (and we demonstrated this repeatedly to the timorous ward personnel) we could playfully grab Oretha by the throat, twist her arm, tickle her in the ribs and slap her behind without eliciting anything more than a wide grin or a hoarse chuckle.29
Lobotomy was to be seen as a “surgically induced childhood.” As patients began to stir, they would be given coloring books and crayons. Families were advised to bring them dolls or teddy bears to help keep their simple minds occupied. At times, however, patients recovering from the surgery might stir from their lethargy into overly restless behavior. In that case, Freeman and Watts advised stunning them with electroshock, even as early as a week after the brain surgery. “A few electric shocks may alter the behavior in a gratifying manner . . . When employed, it should be rather vigorous—two to four grand mal seizures a day for the first two days, depending upon the result.”30
About 25 percent of their patients never progressed beyond this initial stage of recovery and had to remain institutionalized. Some became disruptive again and underwent a second and even a third surgery; each time Freeman and Watts would disconnect a larger section of their frontal lobes. As long as these patients reached a state where they remained quiet and no longer disturbed the wards as they once had, Freeman and Watts would judge them to have had “good” outcomes.
However, the majority of their patients were able to leave the hospital. In the clinical trials, this was seen as conclusive evidence of a positive outcome. What the medical journals failed to detail, though, was the patients’ behavior once they returned home. A lobotomized patient was likely to sorely try a family’s patience.
The patient’s extreme lethargy and lack of initiative were likely to persist, particularly during the first months. Families would need to pull their loved ones from their beds, as otherwise they might never rise. Freeman and Watts noted that even a full bladder might not rouse the patient:
It is especially necessary for somebody to pull him out of bed since he won’t go to the toilet, and only alertness on the part of those who care for him will prevent a lot of linen going unnecessarily to the laundry. Once the patient has been guided faithfully to the toilet, he may take an hour to complete his business. Then he has to be pulled up off the seat. “I’m doing it,” he says. “Just a little while, I’m nearly finished.” Usually he finishes in a very little while, but the passage of time means nothing to him and he stays on, not thinking, merely inert. If other members of the family are waiting for the use of the bathroom, this type of behavior can be exasperating.31
Families could expect that getting their loved ones dressed, undressed, and bathed would be a chore. They would spend hours in the tub, not washing but, “like little children,” spending their time “squirting water around.” As they lacked any sense of shame, they sometimes would “present themselves to acquaintances and even strangers inadequately clad.” They would likely put on weight, some women getting so fat they would “burst the seams of their dresses and not take the trouble to sew them up.” At the table, many would focus single-mindedly on eating, at times grabbing food from the plates of others. This type of behavior, Freeman and Watts cautioned, should be “discouraged from the start.”32
But efforts to get them to improve their manners were likely to prove futile. “No amount of pleading, reasoning, tears or anger” would do any good. Nor would criticism. Hurl the most insulting epithets at them, Freeman and Watts said, and they would just smile. In fact, “the more insulted they are, the better the patients seem to enjoy it.” Even physical abuse might not bother them.
Patients who have undergone prefrontal lobotomy can stand an enormous amount of maternal overprotection, paternal rejection, sibling rivalry, physical discomfort, strained family situations and loss of loved ones. These happenings in the family constellation make no deep emotional impression upon them . . . occasionally they will cry in response to an external stimulus like the sad part of a movie or a radio act. For themselves and their own sometimes pitiable states, however, they do not mourn. Some patients have taken serious beatings—financial, occupational, even physical—and have come up smiling.33
About 25 percent of discharged patients, Freeman and Watts wrote, could be “considered as adjusting at the level of a domestic invalid or household pet.” This was not to be seen as a bad outcome, however. These patients, relieved of their mental worries, could now devote their “talents to gossiping with the neighbors or just looking out the window.”
We are quite happy about these folks, and although the families may have their trials and tribulations because of indolence and lack of cooperation, nevertheless when it comes right down to the question of such domestic invalidism as against the type of raving maniac that was operated on, the results could hardly be called anything but good.34
Even if the patient had been employed a short time before the surgery, Freeman and Watts still considered the operation to have produced a “fair” result if the patient “becomes a drone, living at home in idleness.” They did express regret, however, that some of their patients in this “category of household drone” had been “highly intelligent, gifted, ambitious, and energetic people” who had been operated on a short time after they had fallen ill and, prior to surgery, “had considerable prospects of returning to an active, useful, existence.”35
Some lobotomized patients did progress beyond this “household pet” level. They were able to become employed again and resume some measure of social life. These were the best outcomes, those who, in the medical literature, were reported to have been miraculously transformed. But, Freeman and Watts cautioned, families shouldn’t expect them to do particularly well in their jobs. The only positions that lobotomized patients could hope to take were simple ones that required a “minimum of punctuality, industry, accuracy, compliance, and adaptability.” Even a task like keeping house would likely prove too difficult because it required juggling multiple tasks and planning ahead. And while their amiable dispositions might help them land work, they would regularly be fired because “the employer expects a certain amount of production.”
Sex was another waterloo. The lobotomized male, Freeman and Watts explained, might begin to paw his wife “at inconvenient times and under circumstances when she may be embarrassed and sometimes it develops into a ticklish situation.” His lovemaking was also “apt to be at a somewhat immature level in that the patient seeks sexual gratification without particularly thinking out a plan of procedure.” It was up to the woman to learn to enjoy such deficiencies:
Refusal [of sex] . . . has led to one savage beating that we know of and to several separations. Physical self-defense is probably the best tactic for the woman. Her husband may have regressed to the cave-man level, and she owes it to him to be responsive at the cave-woman level. It may not be agreeable at first, but she will soon find it exhilarating if unconventional.36
Even at the highest stage of recovery, lobotomized patients could not be expected to provide advice of any merit. Those who had been artists or musicians before becoming ill would never regain much interest in such pursuits. They might play the piano for a while in a mechanical way, but the “emotional exhilaration” that comes from playing would be absent, and eventually they would stop playing altogether. Those who had inventive imaginations before surgery would become “dull and uninspired.” People who “previous to operation had been absorbed in their studies of philosophy, psychology, world affairs, medieval history, and so on, find that their preference turns to action stories, murder mysteries, the sports pages and the comics.” Nor would they, in their lobotomized state, experience spiritual yearnings, any desire to know God.37
Freeman and Watts saw this diminishment as a necessary and even good thing for the mentally ill. Many of their patients had become sick precisely because their minds had been too inventive. Patients who once could find “meaning in the verse of obscure poets” or could imagine what history “would have been like if the early Norsemen had intermarried with the Indians and then descended upon the colonists before they had time to become established” could now live undisturbed by such elaborate mental machinations. Such high-flying imagination, Freeman and Watts wrote, becomes “so entrancing that the individual loses sight of the humdrum pattern of getting an education or earning a living,” and if “creative artistry has to be sacrificed in the process, it is perhaps just as well to have a taxpayer in the lower brackets as the result.” The person who had once painted pictures, written poetry, or composed music was now “no longer ashamed to fetch and carry, to wait on tables or make beds or empty cans.” Their best-outcome patients could be described “as good solid cake but no icing.”38
Such was Freeman and Watts’s description of the behavior of lobotomized patients. Most telling of all, in their book they also reflected on what their patients’ behavior revealed about frontal-lobe function. They had now observed hundreds of Phineas Gages. The frontal lobes, they concluded, are the “highest endowment of mankind.” It is this area of the brain that gives us consciousness of the self, that allows us to experience ourselves and to project ourselves into the past, present, and future. This is the brain center that allows us to care deeply about who we are and our fate. This is the brain region that stirs creative impulses, ambition, a capacity for love, and spiritual yearnings. The Greeks had been right, Broca had been right, and so had Ferrier and Bianchi. The frontal lobes were what made us uniquely human.
And that’s what needed to be taken from the mentally ill.
This mental activity, Freeman and Watts explained, was the source of their suffering. Disconnecting the frontal lobes freed the mentally ill from “disagreeable self-consciousness.” It liberated them from “all sense of personal responsibility and of anxious self-questioning as to the ethical rightness of their conduct.” The lobotomized person, unable to form a mental picture of the “self,” would no longer worry about past or future:
He is freed from anxiety and from feelings of inferiority; he loses interest in himself, both as to his body and as to his relation with his environment, no longer caring whether his heart beats or his stomach churns, or whether his remarks embarrass his associates. His interests turn outward, and obsessive thinking is abolished . . . there is something childlike in the cheerful and unselfconscious behavior of the operated patient.39
This was the change described by Freeman and Watts in their first published reports as the loss of a certain “spark” in personality. Lobotomy was not surgery of the soul. This was surgery that removed the soul. As one critic said, lobotomy was a “partial euthanasia.”40 But the trial results published in the medical journals never captured this sense of profound loss. The journal articles conveyed a different reality, telling in general of an operation that could transform hopelessly lost patients on back wards into happy people, some of whom were working and leading fulfilling social lives.
The question that arises today is what drove the creation of that different reality. Why did those who performed this surgery in the late 1930s and early 1940s see their patients’ outcomes through such a rosy lens? For that is clearly what they saw. They perceived this surgery as one that could offer great benefits to the mentally ill.

The Influence of Money

In many ways, the success of lobotomy was foretold before Moniz took up his knife. Ever since the turn of the century, of course, psychiatry had been seeking to transform itself into an academic medical discipline, and that meant it had set its sights on developing modern, science-based treatments. Lobotomy fit this bill perfectly. Brain surgery carried with it the luster of being technologically advanced, born from a keen understanding of how the brain worked. Equally important, the Rockefeller Foundation was providing research funds to produce just this type of success. In the 1920s, the Rockefeller Foundation had identified psychiatry as the medical specialty most in need of reform and had begun providing funding—to the tune of $16 million over the course of twenty years—to achieve this change. Rockefeller money financed new departments of psychiatry at several medical schools. It paid for the creation of research laboratories at the schools as well. Various academic psychiatrists were given money to help introduce new clinical treatments. And the hope, and expectation, was that all of these efforts would come together in a classic fashion: Basic research would lead to a better understanding of the biology of the brain, and that knowledge would lead to new treatments. Once the Rockefeller monies started flowing, the clock started ticking—the vision was clear, and Rockefeller-funded scientists could be expected to help achieve it.41
One of the Rockefeller-funded scientists was John Fulton. He was chairman of the physiology department at Yale University and directed the laboratory where Carlyle Jacobsen conducted his chimp experiments. Jacobsen had designed his studies to probe frontal-lobe function and to identify deficits associated with injury to this region of the brain. He was not investigating whether the frontal lobes might provide a remedy for emotional disorders in humans. However, he had made a casual observation that one of the chimps, Becky, had become calmer after the surgery, and once Moniz reported on his new operation, Fulton spun this observation for his benefit. He told the editor of the New England Journal of Medicine, Boston neurologist Henry Viets, that the surgery was “well conceived.” Why? Because, Fulton explained, it had been based on animal experiments in his lab that had shown that removing the frontal lobes prevented neurosis. This led the journal to editorialize, in 1936, that lobotomy was “based on sound physiological observations” and was a “rational procedure.”42 This same story appeared in a 1938 textbook, and soon it had become an accepted “fact” that Moniz had tried lobotomy only after the chimp experiments had proven that it was likely to work. Fulton even came to believe that story himself, proudly writing in his diary that “the operation had its origin in our lab.”43 By seeing the chimp experiments in this way, Fulton was both grabbing a share of the lobotomy glory for himself and making the point that the Rockefeller money coming to his lab was being well spent.
Another Rockefeller recipient was Edward Strecker, at the University of Pennsylvania. He’d received funds to bring advanced medical treatments into the crowded mental hospitals. Such hospitals were filled with chronic schizophrenics. Those patients were precisely the type that both Moniz and Freeman had found did not benefit from lobotomy, which seemingly would have discouraged Strecker from trying it on them. But he did it anyway, because that is what Rockefeller money expected him to do. And when he concluded that Moniz and Freeman were mistaken, that prefrontal lobotomy benefited this group as well, he—like Fulton—was fulfilling his Rockefeller mandate. Similarly, Washington University in St. Louis, Missouri, had received Rockefeller funding to create a strong program in neurosurgery. After Freeman began reporting positive results with prefrontal lobotomy, the school hired Carlyle Jacobsen as its medical psychologist. He was expected to help Washington University neurosurgeons develop better surgical techniques for lobotomy, a refinement that would minimize the deficits produced by the operation. And like Fulton and Strecker, the Washington University physicians—after fiddling with the surgical methods for the operation—were soon reporting results that indicated the Rockefeller funds were being well spent. From 1941 to 1944, they operated on 101 chronic schizophrenics said to have no hope of recovery and announced that with their improved surgical techniques, fourteen of the patients had been essentially cured, thirty had been able to leave the hospital, and none had become worse. They had developed a method for using lobotomy to help even the most dilapidated schizophrenics.
In short, all of these scientists declared results that worked for them. Their announced success ensured that the Rockefeller funds would keep flowing. And collectively, they were each pitching in to tell a story—of basic research producing a breakthrough medical treatment—that signaled psychiatry’s arrival as a modern, science-based discipline.
The influence of money can be seen in other ways as well. Neurosurgeons had been waiting for some time for an operation like lobotomy to come along. In the 1930s, they had to scramble for patients. They operated primarily on brain tumors, which were not common enough to provide most neurosurgeons with a prosperous practice. When Watts first set up his practice in Washington, D.C., he told Fulton that he expected it would take years to make the practice profitable. Lobotomy offered neurosurgeons a whole new group of patients to operate on, and it wouldn’t be difficult finding them—the state hospitals were filled with hundreds of thousands of people. When Watts presented his initial lobotomy results to the Harvey Cushing Society, which neurosurgeons formed in 1932 to promote their interests, the members responded that “these procedures should be tried.”44 They could hope to earn fees ranging from several hundred dollars to $1,500 for performing a lobotomy, attractive sums to surgeons whose annual salaries at that time might not exceed $5,000. As Harvard Medical School’s Stanley Cobb later said: Frontal lobotomy was “returning great dividends to the physiologists. But how great the return is to the patient is still to be evaluated.”45
State governments also had financial reasons for embracing lobotomy. With more than 400,000 people in public mental hospitals, any therapy that would make it possible to send patients home would be welcomed for the monetary savings it produced. In 1941, Mesroop Tarumianz, superintendent at Delaware State Hospital, calculated this fiscal benefit in detail. He told his peers at an AMA meeting that 180 of the hospital’s 1,250 patients would be good candidates for lobotomy; it would cost the state $45,000 to have them operated on. Ten percent could be expected to die as a result of the operation (mostly from cerebral hemorrhages); of the remaining 162 survivors, eighty-one could be expected to improve to the point they could be discharged. All told, the state would be relieved of the care of ninety-nine patients (eighteen deaths and eighty-one discharges), which would produce a savings of $351,000 over a period of ten years. “These figures being for the small state of Delaware, you can visualize what this could mean in larger states and in the country as a whole,” Tarumianz told the AMA.46
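Tarumianz’s arithmetic is easy to follow once one fills in a figure he left unstated in the passage quoted here: taken at face value, his numbers imply a maintenance cost of roughly $400 per patient per year. A back-of-envelope reconstruction, under that assumption:

\[
\begin{aligned}
\text{deaths} &= 0.10 \times 180 = 18, \qquad \text{survivors} = 180 - 18 = 162,\\
\text{discharges} &= 81, \qquad \text{patients off the rolls} = 18 + 81 = 99,\\
\text{ten-year net savings} &= 99 \times 10 \times \$400 \;-\; \$45{,}000 = \$396{,}000 - \$45{,}000 = \$351{,}000.
\end{aligned}
\]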
All of these factors fed into each other and encouraged physicians and society alike to see lobotomy in a positive light. There was money to be earned, money to be saved, and professional advancement to be had. But of course that was not the story that psychiatry could tell to itself or to society—everyone would still need to believe that the operation benefited the mentally ill. Those evaluating outcomes would have to find that the patients were better off. They did so for a very simple reason: They framed the question of efficacy, in their own minds, in a way that made it virtually impossible for the surgery to fail.
As various physicians tried the surgery, they routinely described their patients as having no hope of getting well again without the operation. For instance, Francis Grant wrote in his first lobotomy report that agitated depression renders “the life of the victim little worth living” and that without radical intervention, many “can expect no relief from their misery until death intervenes.”47 Wisconsin neurosurgeon David Cleveland said that all fifteen of his first lobotomy patients were “equally hopeless,” even though six of the fifteen were under thirty years old, and one was a sixteen-year-old boy, newly ill, whose primary symptoms were “malignant-looking withdrawal” and “silliness.”48 Watts, meanwhile, once answered critics by describing patients operated on as having descended to the level of animals: “They are often naked, refusing to wear clothes, urinate and defecate in the corner. . . . Food is poked through a crack in the door like feeding an animal in a cage.”49
That perception of the hospitalized mentally ill was accurate in one regard: It did fit prevailing societal views, arising from eugenic beliefs, about the “worth” of the mentally ill. They didn’t have any intrinsic value as they were. Nor did people with such bad “germ plasm” have a natural capacity for recovery. And given that starting point for assessing outcomes, any change in behavior that resulted in the patients’ becoming more manageable (or less of a bother) could be judged as an improvement. What could be worse than hopeless? At Winnebago State Hospital in Wisconsin, physicians used an outcomes scale that ranged from no change to slight improvement to being able to go home. They didn’t even allow for the possibility that patients might become worse. Lyerly used a similar scale: Patients could be seen as “greatly improved, moderately improved, slightly improved and temporarily improved.”50 Their outcome measurements explain why Ziegler, when tallying up the cumulative outcomes for lobotomy patients in 1943, found that 84 percent of the 618 patients had improved, and only 1 percent had “deteriorated.” Eugenic conceptions of the mentally ill had provided a baseline for perceiving frontal lobotomy as a rousing success.
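Those percentages square with the raw counts Ziegler reported in 1943, assuming his “deteriorated” category corresponds to the eight patients said to have worsened:

\[
\frac{518}{618} \approx 0.84 = 84\%, \qquad\qquad \frac{8}{618} \approx 0.013 \approx 1\%.
\]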

A Minor Surgery

Stories of medical success have a way of spinning out of control, and so it was with lobotomy. The results announced by Strecker, Grant, and others led, in the early 1940s, to a new round of feature stories in newspapers and magazines, and the writers and editors didn’t spare the hyperbole. “Surgeon’s Knife Restores Sanity to Nerve Victims,” screamed one headline. “No Worse Than Removing Tooth,” said another. “Wizardry of Surgery Restores Sanity to Fifty Raving Maniacs,” said a third.51 The Saturday Evening Post compared lobotomy surgeons to master watchmakers, writing that they drilled holes into the brain “at just the right marks, inserting tools very carefully to avoid touching little wheels that might be injured . . . they know the ‘works’ within the skull.”52 And with the press outdoing itself in this way, the use of lobotomy exploded. Prior to the end of World War II, prefrontal lobotomy had been performed on fewer than 1,000 people in the United States. But over the next decade, more than 20,000 underwent the operation, which also came to be seen as appropriate for an ever-widening circle of patients. Some—mostly women—voluntarily sought it out as a cure for simple depression. College graduates suffering from neurosis or early onset of psychosis were said to be particularly good candidates for the surgery. Freeman and a handful of others tried it as a way to cure troubled children. Most of all, however, it became regularly employed at state mental hospitals.
Freeman acted as the pied piper for this expansion. Not only did he ceaselessly promote its merits, but he also developed a simplified operating technique—transorbital lobotomy—that made the surgery quicker to perform. Instead of drilling holes in the sides of the patient’s head, Freeman attacked the frontal lobes through the eye sockets. He would use an ice pick to poke a hole in the bony orbit above each eye and then insert it seven centimeters deep into the brain. At that point, he would move behind the patient’s head and pull up on the ice pick to destroy the frontal-lobe nerve fibers.53 With this new method, Freeman reasoned, it wasn’t necessary to sterilize the operating field and waste time with that “germ crap.” The use of anesthesia could also be eliminated. Instead, he would knock patients out with electroshock before hammering the ice pick through their eye sockets. This saved time and, he believed, added a therapeutic element. The electroshock—three shocks in quick succession—scrambled the “cortical patterns” responsible for psychosis; the surgical destruction of the frontal-lobe tissue then prevented “the patterns from reforming,” he said.
Freeman performed his first transorbital lobotomy in 1946. He could do the procedure, which he termed a “minor operation,” in less than twenty minutes. With the new approach, intellectual deficits were reduced, he said, and he touted it as a surgery suitable for those who were only mildly ill and not in need of hospitalization. People eager to be relieved of depression or anxiety could undergo the office procedure and leave a few hours later. Freeman’s principal advice to families was to bring sunglasses—they would be needed to cover up the patient’s black eyes. Other than that, Freeman suggested, patients would likely recover quickly and probably wouldn’t even remember having been operated on.
Many families traveled from distant cities to bring their loved ones to Freeman for the quick-fix surgery. The patient’s own wishes regarding the operation weren’t seen as important; rather, it was the family’s interests that were paramount. In fact, Freeman saw resistance in patients—whether they were hospitalized or not—as evidence they were good candidates for lobotomy.
Some patients come to lobotomy after a long series of exasperating treatments . . . They are still desperate, and will go to any length to get rid of their distress. Other patients can’t be dragged into the hospital and have to be held down on a bed in a hotel room until sufficient shock treatment can be given to render them manageable. We like both of these types. It is the fishy-handed, droopy-faced individual who grunts an uh-huh and goes along with the family when they take him to the hospital that causes us to shake our heads and wonder just how far we will get.54
Soon Freeman was taking his new technique on the road, intent on introducing it to state mental hospitals across the country. He spent his summers driving his station wagon from asylum to asylum, equipped with a pocket set of ice picks for doing surgery after surgery. In any one day, he might operate on a dozen or more patients, screening records when he arrived and then quickly choosing those he deemed suitable. Practiced as he was by then, he could do the surgery in less than ten minutes and would charge the asylums as little as $25 for each one. To quicken the process, he would drive picks into both eye sockets at once rather than one at a time; he could then step behind the patient and pull on both ice picks simultaneously, destroying tissue in both frontal lobes and shaving a few minutes off the operating time. He would perform so many surgeries in one day that his hands would become sore and his forearms would grow weary.
As part of his routine, Freeman would often train the hospital psychiatrist or psychiatric resident in the procedure. Transorbital lobotomy was so simple, he believed, that even someone with no prior training in surgery could be taught how to do it in a single afternoon. At Milledgeville State Hospital in Georgia, Dr. Lewis Hatcher described his understanding of the technique: “I take a sort of medical icepick, hold it like this, bop it through the bones just above the eyeball, push it up into the brain, swiggle it around, cut the brain fibers like this, and that’s it. The patient doesn’t feel a thing.”55
Other physicians who adopted transorbital lobotomy echoed Freeman’s argument that it was a minor operation. After conducting more than 100 transorbital procedures at Philadelphia Psychiatric Hospital, Matthew Moore determined not only that a psychiatrist could easily do the operation but that no elaborate equipment or facilities were needed. “It can be stated categorically that if this procedure is ineffectual in helping the patient it will do no harm; the patient may not be improved, but he will not be made worse.”56
Once lobotomy became commonplace in state asylums, it was quickly put to use as a treatment for disruptive patients who couldn’t be quieted by electroshock. The use of lobotomy at Stockton State Hospital in California, which began in 1947, exemplified this pattern.57 The first patient lobotomized there was a thirty-three-year-old woman who had undergone 450 electroshock treatments during her first six years at the hospital but still misbehaved. She swore regularly and had poor hygiene. After lobotomy, though, she turned “childlike, naïve, and quite friendly,” her new behavior much more pleasing to the staff.
Over the course of the next seven years, 232 patients were lobotomized at Stockton Hospital. California law required that the hospital obtain consent from the patient’s family, which was told that the surgery was a “delicate brain operation” and “the most advanced type of treatment that is now available.” However, in their chart records, the Stockton doctors privately expressed their real reason for recommending lobotomy: This was an operation that could turn “resistive, destructive” patients into “passive” ones. In 1949, the California Department of Mental Hygiene approvingly noted that lobotomy had been used by Stockton and other state hospitals “chiefly to pacify noisy, assaultive, and uncooperative patients.”58
The last lobotomy at Stockton Hospital was performed in 1954. Joel Braslow, in his book Mental Ills and Bodily Cures, has tallied up the cumulative results: Twelve percent of the patients died from the surgery, mostly because of bleeding in the brain. Many were disabled by seizures, incontinence, and lasting disorientation. By 1960, only 23 percent of the lobotomized patients had been able to leave the hospital, and nobody wanted to provide care for those left on the wards. During the next two decades, as part of the deinstitutionalization process, most were finally discharged to nursing homes. The hospital, putting one last positive spin on the lobotomy era, typically stamped the departing patients’ records with such optimistic conclusions as “improved” and “treatment concluded.”59
More than 60 percent of all people lobotomized in the United States were patients at state mental hospitals. But like any “successful” procedure, lobotomy was eventually tried on children.
In 1950, Freeman and Watts reported that they had operated on eleven troubled youths, including one only four years old. “The aim has been to smash the world of fantasy in which these children are becoming more and more submerged,” they explained. “It is easier to smash the world of fantasy, to cut down upon the emotional interest that the child pays to his inner experiences, than it is to redirect his behavior into socially acceptable channels.”60 Although two of the eleven died, three had to be institutionalized, and three others were described as “antagonistic,” “irresponsible,” and exhibiting “profound inertia,” Freeman and Watts concluded that this first trial in children had produced “modest results,” and Freeman continued to occasionally perform such operations throughout the 1950s.

A Eugenic Solution

Medical therapeutics for the mentally ill, and how they are used, invariably reflect underlying societal values. In the 1700s, European societies conceived of the mentally ill as beings that, without their reason, had descended to the level of animals, and they developed harsh therapeutics to tame and subdue them. In the early 1800s, the Quakers in York, England, viewed the mentally ill as brethren, as fellow human beings worthy of their empathy, and fashioned a therapeutic that emphasized kindness and the comforts of a good home. In the first half of the twentieth century, America conceived of the mentally ill as hereditary defectives, without the rights of “normal” citizens. That set the stage for therapeutics that were designed to alter who the mentally ill were, with such remedies to be applied even over their protests.
Insulin coma, metrazol, forced electroshock, and lobotomy all fit this model. Lobotomy simply carried “brain-damaging therapeutics”—a phrase coined by Freeman—to their logical conclusion. This operation, as physician Leo Alexander pointed out in 1940, was a more precise way to damage the brain:
There is agreement that the clinical improvement following metrazol or insulin therapy is essentially due to destruction of brain tissue, and that the clinical improvement caused by metrazol or insulin treatment has essentially the same rationale as frontal lobotomy. There can be no doubt, from the scientific point of view, that a method in which one knows what parts of the brain are destroyed is preferable to one in which destruction is unpredictable, at random, and more or less left to chance.61
In Germany, eugenic attitudes toward the mentally ill led to a euthanasia program. Nazi physicians perceived it as a merciful “medical treatment,” and the Nazi government set up a “medical office” to carry it out. Psychiatrists and other doctors decided which mentally ill people needed to be “relieved” of the burden of living. In the United States, eugenics led to a different end, but one clearly consistent with its beliefs. It led to a quartet of therapeutics, applied regularly without the patient’s consent, that filled the mentally ill with terror, broke their bones, robbed them of their memories, and, in the manner of a partial euthanasia, “relieved” them of the very part of the mind that makes us human. The path to lobotomy, it becomes clear, began not with Moniz but with Charles Davenport and his scorn for the “unfit.” Franz Kallmann’s description of the mentally ill as individuals who were not “biologically satisfactory,” the American Eugenics Society’s catechism that disparaged the mentally ill as “cancers in the body politic,” and the U.S. Supreme Court’s 1927 decision authorizing compulsory sterilization of the mentally ill were all stops on that path. Metrazol, forced electroshock, and lobotomy were medical solutions in keeping with a eugenic conception of the mentally ill.
However, American society has never perceived those treatments in this light, and certainly it did not in the years immediately after World War II. Doctors in Germany, shamed by the revelations at the Nuremberg Doctors Trial, viewed lobotomy with much wariness, seeing it as reminiscent of euthanasia. Freeman’s transorbital lobotomy particularly appalled them. But the view was quite different in the United States. The country was in a triumphant mood, newly confident of its ways, and psychiatry saw in this surgery evidence of its own triumph and arrival as a modern discipline. In 1948, the American Journal of Psychiatry proudly commented that “every step of [the pioneers’] progress in this rapidly growing field is marked by a deep sense of primary obligation to the patient, and a profound respect for the human brain.”62 Mental Hygiene News adopted a darkened landscape pierced by the light of lobotomy’s torch as a symbol for its masthead—lobotomy was the beacon that had so transformed psychiatry. The New England Journal of Medicine editorialized that “a new psychiatry may be said to have been born in 1935, when Moniz took his first bold step in the field of psychosurgery.”63 And when Moniz was awarded the 1949 Nobel Prize in Physiology or Medicine, the New York Times hailed the “explorers of the brain” who had invented this “sensational operation”:
Hypochondriacs no longer thought they were going to die, would-be suicides found life acceptable, sufferers from persecution complexes forgot the machinations of imaginary conspirators . . . surgeons now think no more of operating on the brain than they do of removing an appendix . . . it is just a big organ with very difficult and complicated functions to perform and no more sacred than the liver.64
The tale America had been telling itself had wound its way to a wholly satisfying conclusion: Lobotomy was the fruit of both good science and a humanitarian empathy for the mentally ill.