17

“We Have Done Pitifully Little About Mental Illnesses”

War, to paraphrase Clausewitz, is the continuation of insanity by other means. The twentieth century exemplified this as no other period in history.

The century just past accelerated developments in ever-newer, ever more mechanized, ever more terrifying and destructive weaponry. It made routine the strategies of “total” warfare, which metastasized from the ancient practice of relatively small professional armies facing one another across rural fields into the spectacle of million-man juggernauts sweeping across terrain and through towns and cities, butchering, burning, and raping indiscriminately.* (Both of these profanations had been tested in the latter stages of the American Civil War.)

The twentieth century was the first to send poison gas and flamethrowers onto the battlefield, and the first to build death camps where millions of civilians perished in the gas chambers. The century introduced programmed starvation, mass machine-gunning of noncombatant prisoners, torture, and sadistic bloody experiments on the bodies of conscious victims. The century introduced aerial bombing, and then the firebombing, of great metropolitan centers. The twentieth century introduced the atomic bomb.

One might expect to learn that the effects of these routinized atrocities upon the mental health of combatants and civilians of the twentieth and twenty-first centuries are incalculable. But that is not strictly true. They are at least roughly calculable, thanks to improvements in the sciences of tabulation and statistical sampling.

It is possible to say with assurance, as did two psychiatric researchers writing in the journal World Psychiatry, that “among the consequences of war, the impact on the mental health of the civilian population is one of the most significant.” And that studies of general populations “show a definite increase in the incidence and prevalence of mental disorders.” And that “women are more affected than men.” And, of course, that the most vulnerable groups include children and the elderly.1

Certainly war’s effects have always been keenly felt by veterans and their families. Poets, correspondents, priests, medics—all have likewise experienced the shock of battle firsthand and witnessed its power to derange the human mind. Yet before the unpopular Vietnam War, nobody said much about it beyond these tight circles. The military code of manly silence, the hard strictures of patriotism, and medical ignorance of how the brain works made it all but unthinkable for a combatant to admit that he or she had become unhinged. Any soldier foolish enough to give voice to such unmanly sniveling risked being mocked by an officer or slapped around by a three-star general, then reviled as a coward and physically kicked out of the medical tent. The veterans brought their psychic wreckage home with them, awoke screaming in the night for decades, soaked their pajamas with sweat and urine, drank themselves insensate, filled themselves with pills of all kinds, brawled with their wives or in the workplace—and, all too often, ended the torture by killing themselves.

It took a president who himself had known intense, highly mechanized combat—had known it as he screamed out artillery orders in the chaos of a massive and decisive battle—to awaken the federal government and push it, for the first time in its history, into a role of responsibility in the financing and the care of warfare’s psychic casualties—and from there to a share of responsibility toward all of the nation’s mentally ill.

The president who spurred Congress into action on mental health was Harry Truman. He made his case in a message to Congress on November 19, 1945.

Truman’s call to action on mental illness came as the embers of history’s greatest, ghastliest war still glowed in Japan and Germany. The message elaborated on a bold transitional challenge that Truman had issued two months earlier, on September 6, just four days after Japan’s formal surrender ended the war.

Truman’s September address to Congress had been shrewdly timed. Less than a year earlier the small Missouri senator had only reluctantly accepted the summons of Franklin Delano Roosevelt to join the Democratic ticket and run for vice president. In office not quite three months, he’d been thunderstruck at the news of Roosevelt’s death on April 12. But he had regained his combative self-assurance in time to drop two atomic bombs on Japan in early August, forcing the end of World War II. Now Truman was at his full cruising speed. He’d hammered out an ambitious peacetime domestic agenda, and he wanted it enacted sooner rather than later. Perhaps calculating that some of its items were more radical even than those of his beloved late boss, Truman chose a moment when America’s patriotic exhilaration was at the flood, and when grief over the lost national father—who after twelve years in office had become “virtually the presidency itself,” in the words of David McCullough3—was still fresh. Thus he rolled out his agenda under the title that the dying Roosevelt had introduced in his State of the Union address on January 11, 1944: a “Second Bill of Rights.”

In his own version, Harry Truman ventured where not even FDR had dared to tread. His left hand chopping the air in emphasis, Truman all but dared Congress to enact a program that had already stirred up fiery partisanship, and would continue to do so for the rest of the century: compulsory, federally administered health insurance.

Truman had anticipated the hornet’s nest his proposal would stir up. And yet, instead of pouring all his energies into the idea’s defense, he pivoted to another offensive on November 19: he introduced his crusade for government mental health funding.

Thus Truman widened the definition of health to encompass the brain.

If American outreach toward its insane and psychologically troubled population can be said to have experienced a golden age, it began here.

The president had little chance of seeing his larger dream realized. Universal federal health care, long since common in western European countries, had appealed to Americans as well in the years before World War I. Theodore Roosevelt had run on it in his 1912 “Bull Moose” campaign to recapture the presidency, but he lost the election. Franklin Roosevelt sent signals that he would revive the idea as part of his Social Security bill, which became law in 1935 as a centerpiece of the New Deal. But he backed off. By then, federal health care had aroused the snarling attention of a powerful natural enemy, the American Medical Association.

The AMA had organized in 1847 in Philadelphia to rescue the public from the quackery, ignorance, and shambolic training standards that were rampant among physicians. Less than a century later, reformers were wondering whether the public needed rescuing from the AMA. The organization had hardened into an interest group: one dedicated to enriching its member-doctors and borrowing the ruthless tactics of the newly fashionable political consultants to smear those who might encroach upon its interests. “Those” included presidents. When he learned that the AMA was planning to attack his federal health-care initiative, Roosevelt withdrew it from the bill—more out of annoyance than intimidation. He would get to the issue another time. As the White House physician, Ross McIntire, told a colleague, “The president knew that the American Medical Association would stir up opposition… there is no way of appeasing that crowd.”* 4

But Roosevelt died before that “other time” arrived. And when the stubborn, defiant Truman took it up—in 1946 and again in 1950—“that crowd” hit him with all it had.

“This is not socialized medicine!” Truman had insisted in his September 6 oration. “The American people are the most insurance-minded people in the world! They will not be frightened off from health insurance because some people have misnamed it ‘socialized medicine.’ I repeat—what I am recommending is not socialized medicine! Socialized medicine means that all doctors work as employees of government. The American people want no such system. No such system is here proposed.”5

The AMA would be the judge of that. The group bellowed that it damn well was “socialized medicine,” and by-god “anti-American” in the bargain. As for the president and his fellow communistic-type pinkos in the administration, why, they were nothing but “followers of the Moscow party line.” Before the AMA was finished, Republican senator Robert Taft of Ohio was baying that compulsory health insurance came out of the Soviet constitution, and Republicans were boycotting congressional hearings. Lest anyone remain a little hazy as to where the AMA stood on the issue, one of its members quoted Lenin himself to the effect that “socialized medicine is the keystone to the arch of the Soviet state.” At any rate, all this reasoned argumentation was good enough for Congress, which voted federal health care down.

Truman seethed. “I put it to you,” he railed at a campaign stop in Indianapolis during his famous come-from-behind reelection campaign of 1948. “Is it un-American to visit the sick, aid the afflicted or comfort the dying? I thought that was simple Christianity!”6 Almost no one believed that Harry Truman stood a chance of winning that election against Thomas E. Dewey, so he might have been wise to put the issue aside and come back to it later.

But Truman had no intention of ditching his federal health-care crusade, even though it likely cost him precious votes. He squeaked through to victory anyway.

Yet he still couldn’t beat the AMA. In 1950, when Truman reintroduced his federal health-care measure to Congress, the AMA counterattacked with the most vicious propaganda onslaught that money could buy. The group paid $1.5 million to the wholesome-looking California husband-and-wife team of Clem Whitaker and Leone Baxter, the founders of Campaigns, Inc., the nation’s first political consultancy firm. Whitaker and Baxter struck the template for nearly all political consultants to come, flooding national and local media with relentlessly dishonest claims about federal care and playing to Americans’ worst instincts. They managed, for instance, to link “the opiate of socialized medicine” to both Hitler and Stalin.7

Such was to be the fate of health-care reform bills and initiatives for the ensuing sixty-five years, until the association decided to support the Barack Obama administration on the Patient Protection and Affordable Care Act of 2010.

Viewed in this context, it is a tribute to Truman’s bulldog willpower that he did not withdraw from the fight, but planted his two-toned shoes and launched his second, equally radical parallel campaign: federal guarantee of financial protection for treatment of the mentally ill. Here, he drew on strong allies. He had the indispensable support of many psychiatrists inside and outside the armed forces. These experts testified powerfully in Congress for the cause. The leading advocates were the two most celebrated American psychiatrists of the time, the Kansas-born Menninger brothers, Karl and Will. We will shortly examine the history-changing careers of these two forces of nature.

In 1946, Truman signed the National Mental Health Act, which provided federal funding, for the first time ever, for research into the human mind. William Menninger, who by then was the head of the neuropsychiatry division of the US Army, helped draft the act. A key argument made by Menninger and others was that an infrastructure of sound psychiatric counseling would end up saving money when measured against the tremendous costs to society of incarcerating the insane. In the words of the historian Ellen Herman, they were “advocating that mental health, rather than mental illness, be the centerpiece of federal policy.”8

The act led to the formation of the National Institute of Mental Health in 1949. NIMH is now the largest scientific organization in the world devoted to research on mental illness, with an annual budget of roughly $1.5 billion.

President Truman’s fight to guarantee public financing of mental health care was squarely in line with the progressive Democratic ideals of his time, and, in that sense, unsurprising. But Truman brought a special understanding to the enormous bulge in the numbers of mentally damaged Americans that World War II had produced. Truman knew what happened to combatants in twentieth-century warfare. He’d been one of them.

First Lt. Harry Truman had arrived in France in the spring of 1918 with the 129th Field Artillery Regiment. He rose to the rank of captain, took command of Battery D, and directed artillery fire from forward positions during the horrific Meuse-Argonne Offensive the following autumn. It was the largest (and the most climactic) American engagement of the war, pitting twenty-two US and four French divisions against forty-seven German divisions across a twenty-mile front. The Thirty-Fifth Division, of which Battery D was a part, went into battle with 27,000 men and took 7,300 casualties, the highest rate suffered by any American division in the war.9

Adding to the casualties caused by bullets and shells was the unholy noise generated by the machines that fired the bullets and shells. World War I was history’s first war, the Civil War perhaps excepted, in which sound itself was a debilitating weapon—but a weapon that did not take sides. Battery D’s four French 75mm field guns contributed their small part to a universe of acoustic hell that often reached 140 to 185 decibels or more, levels that ripped men’s eardrums open and could be heard in London, two hundred miles away across the English Channel.10 The maximum tolerable decibel level over a several-hour period is currently held to be about 85.11

In one of war’s infinite little ironies, the mission of Truman’s battery was to provide support for a nearby light tank brigade commanded by a lieutenant colonel named George S. Patton. Patton was destined for glory and an ambiguous legacy in World War II: “ambiguous” because his heroic record of lightning advances at the head of his Third Army was marred by two earlier incidents in Sicily in which he slapped soldiers in hospital tents for their “cowardice.” Patton kicked one of these men out of the tent and drew a pistol on the other. At least one of these soldiers was recovering from shell shock; the other was later diagnosed as having “malarial parasites.”*

World War II increased the din and its tortures to the psyche. Its combat arms were more varied and more powerful than ever: The tank, a marginal presence in the first war, now saturated the battlefield. Its 90mm guns fired at 187 decibels. The new howitzers were even louder, at 189 decibels. Recoilless rifles reached 188 decibels, machine guns 155 decibels, and even a submachine gun could generate 160 decibels.12

During engagements, all or most of these battlefield Frankensteins could be in full roar at the same time, for hours, along miles of front, on both sides of the lines. Their racket approached physical dimensions. Some soldiers believed that they could actually “see” the noise as it curled over them like a giant wave. The mere concussions of exploding shells gouged deep craters. Given all this, it seems miraculous that any combatant could survive ten minutes inside this hell with his sanity undemolished, much less an entire campaign or the war in full. (The madness, of course, was hardly generated by noise alone. Fatigue, anxiety, fear of death, grief over the loss of a comrade or the horror of shooting an enemy—these and other factors did their share in separating fighting men from their senses.) Whatever the causes, the incidence of mental flameouts proved to be double the rate of World War I.

The war’s effects on the human mind produced even more insidious consequences. Like Patton, many officers assumed that twitching, convulsing, or fetally positioned men without visible wounds were faking trauma to get out of combat. The captains, majors, and generals ordered these wrecks back into the line, thus heaping humiliation on top of their jangled psyches.

The vast majority of these “fighting men” in that war, of course, as in all wars, were, and are, boys in the peak years of their susceptibility to schizophrenia.

The Nazi atrocities of human experimentation, revealed to the world in the Nuremberg “Doctors’ Trial” that opened in December 1946, abruptly ended the popular prestige that eugenics had enjoyed since around the turn of the century. The mentally ill have mostly been spared this particular form of mass torture since the first liberating British tanks rolled into Bergen-Belsen.*

Yet the demise of eugenics did not spell the end of suffering under “the lights of perverted science” for America’s mentally ill. Even as World War II—at long last—laid bare the simplistic assumptions of eugenics theory and the moral depravity inherent in its practice, the war pushed an even more outrageous pseudoscience into the mainstream of psychiatric “cure.” That perversion of the healing arts was called the lobotomy.

The modern lobotomy—the back-alley abortion of brain surgery—had been conceived as an antidote to schizophrenia in 1935. Its inventor was a Portuguese neurosurgeon, as he styled himself, named António Egas Moniz.* Moniz called what he did “leukotomies” because he was after white matter—as in brain tissue—and leukos means “white” or “clear” in the ever-dignifying Greek.

Diagnosis was imprecise in those years and would remain so for a long time. No one in the 1930s, as we have seen, had as yet established a baseline for differentiating insanity from severe psychological problems. Thus there was no way to verify that Moniz’s patients—twenty hospitalized and helpless men and women—were in fact insane. As for a cure, no one really had a clue. Lobotomy made as much sense as electroshock, insulin coma therapy, even “refrigeration” therapy. These and other untested methods were being rushed into operating rooms as fast as doctors and tinkerers could dream them up.

Moniz came to believe that his patients’ common problem was an oversupply of emotion. Moniz did not have a lot of training in neurosurgery. In fact, his new technique helped create the field of psychosurgery itself. He knew a little about the brain’s geography, just enough to theorize where the emotional “off” switch was located. He hit upon an idea with some crude nineteenth-century provenance: drilling holes into a patient’s skull, then poking inside with a long thin rod to probe the edges of the frontal lobe. The rod had a small wire attached to the business end. When the doctor gave the rod a twirl, the wire would sever the long nerve fibers that link the frontal lobe with the emotion-producing parts of the brain, the limbic system.

Moniz believed this could neutralize psychosis. And he was right; it could, and did, and often neutralized the patient’s memory, personality, and, sometimes, the entire patient as well. Accidents happen.

Moniz won a Nobel Prize.

It took less than a year for Moniz’s brain-scraping technique to make its inevitable way to the United States, a continental seller’s market for cures. Its importer and promoter was a goateed and dapper Washington doctor named Walter Freeman. Freeman was a brain surgeon in the manner that Professor Harold Hill was a marching band consultant. In fact, he wasn’t a neurosurgeon at all; he was a neuropathologist, and thus no more qualified to stick things into people’s heads than Moniz. So he hired a qualified sidekick named James Watts to handle the drilling and twirling.

Freeman seems to have decided that the European product was underperforming somehow; it could use some American pep and zip. He rebranded it “lobotomy,” perhaps to carve out some marketplace distinction. Lobos means “lobe” in Greek, and is every bit as classy as “leukos.” After several years of directly replicating Moniz’s approach via Watts, Freeman hit upon a way to make the operation more user-friendly, plus eliminate the middleman. Why not just slide the rod in above the eyeball, through the thin bone at the top of the eye socket? No sheepskins necessary for that! Freeman saw that he needed a thinner rod than Moniz had used. He settled on an ice pick—one that he’d found in his kitchen drawer.

The pick needed a couple of knocks from a hammer to get it started, but once inside, it was as easy as one, two… what comes after two?

Freeman named this refinement “transorbital” lobotomy. Watts, now superfluous and finally repelled by it all, fled. No more middleman.

And no problem! Walter Freeman could handle everything on his own. He was a natural publicity animal. (It was he who had nominated Moniz for the Nobel Prize in the first place.) He honed a personal style that set him apart from the pack: he never washed his hands before an operation nor wore a mask during it. He disdained anesthesia for his patients. He performed up to twenty-five lobotomies a day. Sometimes he performed two simultaneously, one with each hand. Often, he would invite audiences into the operating room, including the press: an archival photograph in the Wall Street Journal shows him gripping an ice pick dagger-style, his head cocked in rakish preparation, as observers crowd in. Sometimes he had a bad day at the office. A couple of times the tip of the pick broke off and lodged in the patient’s skull. (Oops.) Even more embarrassing, Freeman once looked up from his patient into a photographer’s lens, lost his concentration, and let the pick slide too deeply into the brain. The patient died.13 The photograph turned out well.

This unfortunate victim thus joined the estimated one-third of Freeman’s patients whose cases the doctor himself admitted were “failures.” Not all died; some simply lost all affect, or were bedeviled by seizures, incontinence, or emotional outbursts.14

Ethically conscious doctors and surgeons were appalled by Freeman’s method, not to mention his style. They pointed out that no medical literature existed to verify its legitimacy or warn of its side effects. Certainly Freeman provided none.

A few thoughtful souls did step forward to excoriate him. In 1948, Nolan Lewis, director of the New York State Psychiatric Institute, demanded of his colleagues: “Is quieting a patient a cure? Perhaps all it accomplishes is to make things more convenient for those who have to nurse them. The patients become rather child-like; they are as dull as blazes. It disturbs me to see the number of zombies that these operations turn out. It should be stopped.”15 The great mathematician and social theorist Norbert Wiener took a similar line of attack that same year: “Prefrontal lobotomy… has recently been having a certain vogue, probably not unconnected with the fact that it makes the custodial care of many patients easier. Let me remark in passing that killing them makes their custodial care still easier.”16

Such condemnations were met with the same judiciousness, compassion, and restraint that had greeted eugenics and “scientific racism”: in 1949, civilian and military doctors across the United States were twirling away to the tune of an estimated five thousand lobotomies a year.17

What under the stars kept this P. T. Barnum of the brain propped up as a legitimate doctor for so long? (His career lasted thirty-two years before his recklessness finally caught up with him.)

The law could not touch him. No laws existed to prohibit lobotomy. No such laws exist today. But the larger reason for Freeman’s impunity derived from need. Specifically, it derived from World War II: the war, and the unprecedented numbers of deranged veterans—both men and women—that this global charnel house was disgorging back to the United States. They had been streaming home, or directly into military hospitals, since Pearl Harbor in late 1941. By war’s end, around 680,000 of them had been wounded in combat. Those were the physically wounded. What truly shocked the populace, as well as psychiatrists, was that almost three times as many veterans, some 1.8 million, had come home needing treatment for wounds to their minds.

For a while in the postwar years, the Veterans Administration hospital psychiatric chiefs tried to keep Freeman at bay. But the overwhelming stream of needy patients soon made it impossible for them to be, as it were, picky. They held their noses and allowed him and Dr. Watts over the threshold. Each man was soon raking in $50 a day—$678 and change in 2016 currency—in consulting fees; that is, fees for teaching other doctors how to tap, shove, and twirl.

When the supply of raw material in the VA hospitals around the country at last began to taper off, Walter Freeman realized that he needed to create a new market. So he purchased a van, christened it “The Lobotomobile,” and went haring around the country, stopping at mental hospitals to do his specialty and, again, to demonstrate it for the resident doctors. It really wasn’t all that hard. A no-brainer, so to speak.

Not until 1967 did the medical community decide that it had had about enough of Walter Freeman. Doctors informally agreed to relieve him of his operating-room privileges. This decision was reached after the woman who proved to be his last victim died from a brain hemorrhage—on Freeman’s third intrusion into her skull. By the time of his own death in 1972—of cancer—Freeman had directed or performed thirty-five hundred operations.

Lobotomy did not expire with Freeman, but it became extremely rare. The antipsychotic drug revolution, which had started in the 1950s, gradually replaced it as a more humane form of mass treatment. The most eloquent eulogy was written by Stephen T. Paul, professor of psychology and social sciences at Robert Morris University in Pittsburgh: “Lobotomy was finally seen for what it was: Not a cure, but a way of managing patients. It did not create new people; it subtracted from the old ones. It was an act of defeat, of frustration.”18

Walter Freeman and his ghoulish fad aside, the early postwar years marked one of the few eras in which the United States seriously engaged the problem of madness amid its populace. It didn’t last long, and it was abruptly supplanted by a kind of Dark Age from which the momentum of public policy has yet to recover. But for a time at least, serious professionals seemed to be on the verge of wresting the fate of mentally ill people from the control of quacks, deluded ideologues, and callous public servants.

The most legendary among them hailed from Topeka, Kansas: the above-mentioned Menninger brothers, Karl and William. These sons of an old-fashioned Presbyterian town doctor and a pious, domineering mother were big men with high domes and prominent beaks and straight-arrow values—well, mostly straight-arrow values. William, born in 1899, became a lifetime Sea Scout. Karl, older than Will by six years, liked to equate mental health with moral health, and occasionally salted his books with pious exhortations. In Whatever Became of Sin? he enjoined men of the cloth to “teach! Tell it like it is. Say it from the pulpit. Cry it from the housetops… Cry comfort, cry repentance, cry hope. Because recognition of our part in the world transgression is the only remaining hope.”19

Evangelistic in their boosting of psychiatry; driven, paternalistic, and brilliant, the two accomplished something that probably no one else among their countrymen could have managed. They rescued psychiatry from the liabilities that were threatening to extinguish its early-century cachet (its taints of Europeanism and elitism on the one hand; clowns such as Freeman with his ice picks on the other). They replaced this imagery with their own stamp—then unique among US psychiatrists—of home-cooked American optimism regarding mental cure, flavored with their entrepreneurial genius. Paradoxically, they accomplished this with a staff liberally stocked with German-Jewish psychiatrists who had fled the encroaching Third Reich. In truth, their conception of psychiatry was destined for obsolescence.

It all started in 1919, when Karl Menninger returned to Topeka from the Harvard Medical School, where he’d graduated cum laude. His mission was to help his father, Dr. Charles Frederick Menninger, establish the Menninger Diagnostic Clinic. Karl was twenty-six then, and William was twenty. The clinic welcomed patients with emotional and “psychological” problems, though years would pass before the brothers could afford to include psychoanalysts on their staff.

For a while, it seemed that there might be no staff—and no clinic, either. The Wicked Witch of the West herself could not have been less welcome in this respectable Kansas town of fifty thousand people and eighty-odd churches than doctors who opened their doors to “maniacs.” Even though the family was known, several upstanding citizens tried to sue their clinic out of town. It didn’t work, but the Menningers’ persuasive powers did, though father and son had to smuggle their patients in under fake diagnoses until everybody calmed down. It helped that Charles Frederick Menninger was a reputable physician, a homeopathy man, which suited the region’s self-reliant traditions. People began to notice that his son spoke in new and fresh and reassuring ways, unlike that gloomy sex-minded Freud over there in Europe. Karl promised “a psychotherapy for the people” and a movement toward “progressive analysis” (which meant roughly the same thing).

The clinic gained popularity, local investors got interested, the father and son attracted psychiatrists who at first had been skeptical, and within five years the clinic had become the Menninger Sanitarium. Starting out in a converted farmhouse with thirteen beds on Southwest Sixth Street, it grew into a nationally known enterprise that spread to over 430 acres on two campuses. Its staff grew to nine hundred.

They cared for patients housed in thirty-nine buildings, including an administration building with a clock tower. Patients were encouraged to linger for months, even years, if they could afford it. These lengthy stays had a self-selecting effect on the clientele: movie stars and politicians, even sitting officeholders, came for treatment. (The brothers were not in fact elitists; their aim was to get psychiatry ingrained into the nation’s cultural fabric. On the other hand, a movie star was a movie star.) In time, the sanitarium became a de facto salon as well; it attracted psychiatric intellectuals and social activists from around the world for formal and informal talks and debates.

Tower, setting, and philosophy of treatment—which emphasized the humanity of the patient, her comfort, exercise, and intellectual stimulation—all of this harked back to the fading era of moral care. Patients and their families arriving at Topeka by train or over dusty roads, perhaps after hours or days of chugging along through a dry, blank dust-bowl landscape, were greeted by a billboard whose message stood apart from the ubiquitous Burma-Shave signs: “WELCOME TO TOPEKA, KANSAS, THE PSYCHIATRIC CAPITAL OF THE WORLD.”

An important event in the rise to national fame was the publication of Karl Menninger’s debut book in 1930, The Human Mind. The intended readership was medical students, but this was among the first books on Freudian-derived psychiatry to be written by a professional yet in language that lay readers could understand. It was a call to liberate the mentally ill from the shadows of “otherness.” It advocated the inclusion of psychiatric principles into the professions, education, and everyday life. It boldly gave voice to a truth that not many people were comfortable contemplating: few if any differences existed between mental asylums and jails.

Most audaciously for that era of entrenched stigma and fear of “maniacs,” The Human Mind maintained that the differences between the mentally ill and the normal were matters of degree, not of kind. Neurological science would later demonstrate that this assertion was but partially true at best, and naive. The chronic diseases—schizophrenia and its related disorders—were indeed beyond the healing power of Freudian “talk” therapy alone, the root system of Menninger’s approach. Yet Karl was convinced that psychotic illness was reversible. He was a big fan of Freud, though he disagreed with the emphasis the Master placed on sex as a font of human motivation—at least publicly. Privately, he conceded the point more than once.

He had undergone Freudian analysis in 1930. Inspired, he’d traveled to Vienna in 1934 to meet Freud and discuss his methods. Freud kept him waiting and then treated him, as the great Ring Lardner put it in another context, like a side dish he had not ordered. Menninger went home mad. Still, Karl’s message on “degree” had value that ordinary people could understand and respect, even as it assaulted their prejudices. Here at last was a point of view that demanded dignity and acceptance for those selfsame “maniacs” of the town, the state, the nation, the earth.

Karl Menninger would write eleven books in all. His first led to a long-standing advice column in the Ladies’ Home Journal, which further cemented his rapport with middle Americans. This in turn earned him the folksy nickname Dr. Karl, an honorific that only after several decades would be bestowed (or self-bestowed) again, this time upon Dr. Phil.

He was a complicated man, and his complexities increased as he aged. While avuncular in his column and charming in his public appearances or when hobnobbing with the likes of Eleanor Roosevelt, Margaret Mead, Aldous Huxley, and Hollywood celebrities, Dr. Karl could be a dour, demanding, irascible man away from the spotlight. His volcanic temper could intimidate underlings. Sometimes even dignified Viennese doctors on his staff felt his sting. “He was… quite arrogant and immensely abrasive,” recalled one of them.20

Over the years these spells of crankiness hardened and played their part in his eventual downfall within the institution he had created.

Karl being Karl, the role of public ambassador for the growing enterprise was left to his younger brother, and Will Menninger was born to that task. He joined the family business after graduating from Cornell University Medical College in 1924 and studying psychiatry at St. Elizabeth’s Hospital in 1927. In 1941, anticipating the imminent need that war would produce, Will assisted Karl in creating the Menninger Foundation for Psychiatric Training and Research. The following year Will was appointed director of the Psychiatry Consultants Division in the office of the Surgeon General of the United States Army. He oversaw the upgrading of the US classification of mental disorders. This document standardized the process by which Army psychiatrists evaluated the mental health of masses of new servicemen and psychically damaged veterans. It was adopted by all the armed services.

By 1944, Will had risen in the Army to brigadier general and chief of Army neuropsychiatry. He knew that the war’s end would soon increase the flood of “battle fatigue” cases, as they were still called. He issued a call for federal support in an initiative to train and hire hundreds of psychiatrists and staff to process the onrush.

His next step placed him in his historic alliance with President Truman. On July 3, 1946, Truman signed the act that created the National Institute of Mental Health. Will was among the chief architects and most persuasive lobbyists for this partnership with the federal government.

The early postwar years proved as needful of their profession as the Menningers had anticipated. As the psychoanalyst and author Kate Schechter has written, “Medically oriented, psychoanalytically trained psychiatrists like William Menninger spearheaded the rapid buildup of psychiatric forces during and after the war, and they soon found themselves at the top of a pyramid of mental health manpower and resources, directing research programs, university departments, and hospitals.”21

The Menningers symbolized psychiatry’s brief golden age. Thanks largely to them, not only the armed services but the American public was embracing the mental healing professions as never before. Psychoanalysis, once scorned as arcane and fraudulent, had become a middle-class status symbol; virtually a consumer product.

In retrospect the golden age was not all that golden. The various Menninger clinics and sanitaria, justly celebrated for their professionalism and abiding decency toward their patients, presented a misleading picture of asylum life in America. It was as awful as it had always been, for the most part. In cities and towns across the country, the mentally ill continued to be mistreated, tortured, deprived of warmth and fresh air and healthy food and human sympathy. A succession of investigative journalists, both print and broadcast, was about to shine its lights into these caverns of atrocity.

The result of this scrutiny, however, would be all too glumly familiar to the universe of the insane: unintended consequences.

In their heyday, Karl and Will Menninger had performed miracles. They had been instrumental in healing, or at least easing, the suffering of tens of thousands of veterans from the psychic damage of World War II. They had resurrected and sustained, for a while, the highest principles of moral care. They had managed the unthinkable task of budging the great American middle class off its great American hindquarters and persuading it to attend to its mental health. Psychiatry for the masses at last was a respectable commodity. As for those kooky people who had to be locked up in “insane asylums”—well, somebody was doing something for them. Weren’t they?

The early postwar years were a time as Dynaflow-driven as a Buick Roadmaster (if one could take one’s mind off nuclear annihilation, at least). Psychiatric care was just as comfy as Linus’s security blanket (a “transitional object,” in the hep new lingo). In the words of the psychologist and scholar Jeremy Safran, the friendly neighborhood shrink “became a purveyor of conservative American middle class values rather than a culturally subversive force.” Safran added acutely, “Mental health, by extension, tended to be defined in terms of conformity to those values.”22

As the 1960s began, some new varieties of “culturally subversive force” were abloom in the nation. Dissent against authority spread, widening its targets: the New Left’s consolidation at Port Huron in 1962, the formation of the counterculture after the assassination of President Kennedy in 1963, the Free Speech movement at Berkeley in 1964, the bloody assaults on civil rights marchers at Selma and Montgomery in 1965, followed by the first anti-Vietnam student march in Washington, followed by the Watts riot in Los Angeles. The Rev. Dr. Martin Luther King Jr. was gunned down in April 1968 and urban rioting lasted for days; Robert Kennedy was assassinated in June. The women’s movement was launched with a demonstration at the Miss America pageant in Atlantic City. Antiwar protesters clashed with police at the 1968 Democratic Convention in Chicago. Yale University broke tradition by admitting women. The Weathermen staged their Days of Rage in Chicago. People began to wonder whether the whole world was going crazy. (And, as mentioned, Thomas Szasz arose to tell them it was not!) Soon, though, that fanciful question became a serious proposition, and a justification for many to celebrate individual madness.

Eclipsed by these history-changing events, nearly all of which dealt blows to traditional authority in government, race relations, education, and family—eclipsed, and virtually forgotten—lay the archipelago of the insane. And the fragile archipelago fragmented even further.