3

THE SEAT OF MADNESS

Today Blackwell’s Island no longer exists. In 1973, the island was renamed after Franklin D. Roosevelt, and the site where Bly spent her ten harrowing days is now home to a luxury condo development. But the kind of anguish she witnessed there doesn’t just disappear. The questions she was trying to answer—questions about what it means to be sane or insane, and what it means to care for a suffering human being who often scares us—remain.

Madness has been dogging humanity for as long as humans have been able to record their own history. But the answer to what causes it—where it can be located, in a manner of speaking—has eluded us just as long. The explanation has ping-ponged throughout history among three players: mind/soul, brain, and environment. First, it was believed to be supernatural, a direct effect of meddling by the gods or devils. Thanks to unearthed skulls dated to around 5000 BC, we know that one of the earliest solutions was to bore holes in the head to release the demons that had presumably taken up residency there, a procedure called trephining. Another way to rid oneself of inner demons was to sacrifice a child or an animal so that the evil spirit could trade one soul for another. Early Hindus believed that seizures were the work of Grahi, a god whose name translates quite literally to “she who seizes.” The ancient Greeks believed that madness descended on them when their gods were angry or vengeful—a belief that continued on with the teachings of Judaism and Christianity. Lose faith or become too prideful and “the Lord shall smite thee with madness,” the Old Testament warned. In the book of Daniel, God punishes Nebuchadnezzar (“those who walk in pride he is able to abase”) by deploying a form of madness that transforms him into a raving beast, stripping away his human capacity for rational thought. Exorcisms, ritualistic torture, and even burnings at the stake were some of the approaches employed to release the devil in unquiet minds. Those who survived suicide attempts—seen as an act spurred on by the devil himself—were dragged through the streets and hanged.

Enlightenment thinkers reshaped madness into irrationality and began to think of it as a by-product of the breakdown of reason rather than an outcome of demonic possession. René Descartes argued that the mind/soul was immaterial, inherently rational, and entirely distinct from our material bodies. Though religion clearly still played a role in this thinking, this dichotomy allowed madness to become “unambiguously a legitimate object of philosophical and medical inquiry,” wrote Roy Porter in Madness: A Brief History.

This area of medical inquiry got a name in 1808: psychiatrie, coined by German physician Johann Christian Reil. The new medical specialty (which should attract only the most forward-thinking practitioners, Reil wrote) would treat mind and brain, soul and body—what is today called the holistic approach. “We will never find pure mental, pure chemical, or mechanical diseases. In all of them one can see the whole,” Reil wrote. The principles he laid out then are as relevant today as they were two centuries ago: Mental illnesses are universal; we should treat people humanely; and those who practice should be medical doctors, not philosophers or theologians.

Reil’s version of psychiatry didn’t deter the many doctors who chased promises of finding the “seat of madness.” What causes it? they wondered. Is there one seat or a host of them? Can we be driven to it by circumstance and environment, or is it rooted solely in the organs within our skulls? Alienists began to target the body, expecting that madness could be isolated and treated—creating some truly horrific treatments along the way, from spinning chairs (developed by Charles Darwin’s grandfather Erasmus Darwin) that induced vertigo and extreme vomiting, believed to lull the patient into a stupor, to “baths of surprise,” where floors fell away, dropping people into cold water below to shock the crazy out. As brutal as these treatments were, they were considered a step forward: At least we weren’t attributing cause to devils and demons anymore.

An early practitioner named Benjamin Rush, a signer of the Declaration of Independence, believed that the cause of madness was seated in the brain’s blood vessels. This prompted him to dream up some deranged treatments, including the “tranquilizing chair” (a case of the worst false advertising ever), a terrifying sensory-deprivation apparatus in which patients were strapped down to a chair with a wooden box placed over their heads to block stimulation, restrict movement, and reduce blood flow to the brain. Patients were stuck in this chair for so long that the seat was modified to include a large hole that could serve as a toilet. The insane weren’t just neglected and ignored; they were abused and tortured—the “otherness” of mental illness making them fair game for acts of outright sadism.

The invention of the microscope led to descriptions of the contours of the brain and nervous system on the cellular level. In 1874, German physician Carl Wernicke pinpointed an area of the brain that, when damaged, created an inability to grasp the meaning of spoken words, a condition called Wernicke’s aphasia. In 1901, Frankfurt-based Dr. Alois Alzheimer treated a fifty-one-year-old woman with profound symptoms of psychosis and dementia. When she died in 1906, Alzheimer opened up her skull and found the cause: plaque deposits that looked like tangled-up sections of fibrous string cheese. So: Was her mental illness caused by nothing more than an unfortunate buildup?

The greatest triumph came from the study of syphilis, a disease all but forgotten today (though seeing a resurgence1) that surfaced in Europe in the late 1400s. The famous people suspected to have had syphilis could crowd a Western civilization Hall of Fame: Vincent van Gogh, Oscar Wilde, Friedrich Nietzsche, Henry VIII, Leo Tolstoy, Scott Joplin, Abraham Lincoln, Ludwig van Beethoven, and Al Capone.

Stories of “the most destructive of all diseases” have abounded since the late Middle Ages. Doctors later called it the “general paralysis of the insane”; these doomed patients made up an estimated 20 percent of all male asylum admissions in the early twentieth century. They staggered into hospitals manic and physically off-balance. Some, under grand delusions of wealth, spent all their money on ridiculous items like fancy hats. Their speech sounded spastic and halting. Over the course of months or years, they would waste away, lose their personalities, memories, and ability to walk and talk, spending their final days sectioned off to the back wards of some local asylum until death. Patient histories, when available, revealed a pattern: Many of these men and women had developed syphilis sores earlier in their lives. Could this sexually transmitted disease be a latent cause of madness?

The answer came when two researchers identified spiral-shaped bacteria called Spirochaeta pallida (known today as Treponema pallidum) in the postmortem brains of the insane with general paralysis. Apparently, the disease could lie dormant for years, later invading the brain and causing the constellation of symptoms that we now know as tertiary syphilis. (Syphilis would come to be called the great pox, the infinite malady, the lady’s disease, the great imitator, and the great masquerader—one more example of the great pretender diseases, because it could look like a host of other conditions, including insanity.) This was, as contemporary psychologist Chris Frith described it, a “kind of peeling of the diagnostic onion.” We had parsed out something we had thought of generally as “insanity” and shown it to have a physical cause. And best of all, we could eventually cure it if we caught it early enough.

(Though they have different causes, the symptoms of syphilis share many similarities with those of autoimmune encephalitis, the disease that struck me, which I guess could give autoimmune encephalitis the dubious honor of being the syphilis of my generation.)

The more we learned about the science of the mind, the hazier the boundary between neurology and psychiatry became. During the twentieth century, neurology broke off into a distinct branch of medicine, and in doing so “claimed exclusive dominion over the organic diseases of the nervous system”—like stroke, multiple sclerosis, and Parkinson’s. Meanwhile, psychiatrists took on the ones “that could not be satisfactorily specified by laboratory science”—like schizophrenia, depression, and anxiety disorders. Once a biological breakthrough was achieved, the illness moved out of psychiatry and into the rest of medicine. Neurologists work to uncover how damage to the brain impairs physical function; psychiatrists are there to understand how this organ gives rise to emotion, motivation, and the self. Though the two fields overlap considerably, the separation embodies our mind/body dualism—and this continues today.

Clearly, syphilis and Alzheimer’s disease weren’t the only causes of insanity. In order to track down and cure the others—if they could be found—psychiatrists still needed to develop a diagnostic language that could help pinpoint the different types (which would hopefully lead to the cleaving out of different causes) of mental illness.

German psychiatrist Emil Kraepelin had been tackling this issue since the late nineteenth century, and though you’ve likely never heard of him, his work has had more influence on the way psychiatry is practiced today than did the famous Sigmund Freud, born the same year: 1856. The son of a vagabond actor / opera singer / storyteller, Kraepelin dedicated his life to organizing mental illnesses into orderly parts, perhaps as a reaction to such an unorthodox father. In doing so he endowed the nascent field with a new nosology, or system of diagnosis, that would later inspire the Diagnostic and Statistical Manual of Mental Disorders, the bible of psychiatry today. Kraepelin studied thousands of cases and subdivided them, breaking down what was described as “madness” into clear categories with varied symptoms as best he could. This culminated in the description of the medical term dementia praecox. Kraepelin defined dementia praecox in his 1893 textbook Psychiatrie as an early onset permanent dementia, a biological illness that caused psychosis and had a deteriorating course with little hope of improvement, causing “incurable and permanent disability.” Kraepelin separated dementia praecox patients from those with “manic-depressive psychosis,” a disorder of mood and emotion that ranged from depression to mania, which had a better long-term prognosis. This division continues today with schizophrenia (and its component parts) and bipolar disorder (and its component parts). (In 1908, fifteen years after Kraepelin presented the diagnosis dementia praecox to the public, Swiss psychiatrist Paul Eugen Bleuler tested out the new term schizophrenia, which translates to “splitting of the mind,” contributing to a long-running confusion2 over the term. Later, psychiatrist Kurt Schneider further defined schizophrenia with a list of “first rank symptoms” that include auditory hallucinations, delusions, and thought broadcasting.)

Now, finally, psychiatrists could make predictions about course and outcome. Most important, they could provide a name for their patients’ suffering, something I personally would argue is one of the most important things a doctor can do, even if a cure isn’t in sight. Still, the cause remained elusive, as it continues to be.

Doctors began to slice and dice their way through “insane” brains. They removed living people’s thyroids, women’s ovaries, and men’s seminal vesicles based on half-baked theories about the genetic origins of madness. An American psychiatrist named Henry Cotton, superintendent of Trenton State Hospital in New Jersey, offered a “focal infection theory” of mental illness, which posited that the toxic by-product of bacterial infections had migrated to the brain, causing insanity. It wasn’t a terrible idea in theory (there are infectious causes of psychosis), but Cotton’s solutions were a nightmare. In an attempt to eliminate the infection, he began by pulling teeth. When that didn’t work, he refused to reconsider and instead removed tonsils, colons, and spleens, which often resulted in permanent disablement or death—and got away with it because his patient population had neither the resources nor the social currency to stop him.

Clinicians and researchers also embraced the growing eugenics movement that argued that insanity was a heritable condition passed down through inferior genes. In America, thirty-two states passed forced sterilization laws between 1907 and 1937—why not stop the spread of undesirables, they thought, by cutting off their ability to reproduce? The Nazis adopted America’s science-approved sadism, sterilizing three hundred thousand or so German psychiatric patients (the most common diagnosis was “feeblemindedness,” followed by schizophrenia and epilepsy) between 1934 and 1939 before they took it one step further and began exterminating “worthless lives”—executing over two hundred thousand mentally ill people in Germany by the end of World War II.

In the aftermath of the war, as the full horror of Nazi atrocities hit the American public, a reassessment of psychiatry and its obsession with finding biological causes for mental illness seemed overdue—especially in 1955, when over a half million people lived in psychiatric hospitals, the highest number ever.

In a strange confluence of events, the same year that Kraepelin popularized dementia praecox, Freud emerged with a new theory of treating the mind called psychoanalysis. While asylum psychiatrists interrogated the body, another group of doctors, psychoanalysts, had moved so far away from the search for an answer in the physical that it was as if they were practicing a different discipline altogether. Psychiatry outside the asylum had little in common with that practiced inside. Outside the asylum, the idea reigned that the mind was the seat of all mental suffering, not the gray matter of the brain. For someone like me, so accustomed to talk of neurotransmitters, dopaminergic pathways, and NMDA receptors, the popular terms of that era, like penis envy, phallic stage, and Oedipal conflict, feel awkward and clumsy, holdovers from a quainter world. But it wasn’t that long ago when these were the norms. Every Baby Boomer alive today was born when terms like these dominated the field.

Psychoanalysis invaded the US by way of Europe right before World War II, offering up a new theory that provided fresh insight into mental anguish—and, for once, real cures—as war-weary soldiers returned from battle healthy by all physical estimations, but emotionally unable to join the workforce or engage in family life. For the first time ever, there were more recorded casualties related to the mind than to the body. It was a sobering thought: If a healthy young man could be reduced to a shaking, fearful, hysterical one without any physical cause, then couldn’t this happen to any of us?

Freud (who died before psychoanalysis really took off in America) gave us a path out of this dark forest of uncertainty. In his explanation, our minds were divided into three parts: the id (the unconscious—rife with repression and unfulfilled desires); the ego (the self); and the superego (the conscience), all engaged in battle. The analyst’s goal was to “make the unconscious conscious” and with a surgeon’s focus zero in on the underlying conflict—our libidos, repressed desires, death drives, projections, and wish fulfillment fantasies; all that deep, dark, murky stuff from our childhoods—on the way to insight. There was “nothing arbitrary or haphazard or accidental or meaningless in anything we do,” wrote Janet Malcolm in Psychoanalysis: The Impossible Profession.

And who wouldn’t want this kind of careful attention and promise of a cure over the dour inevitability that the biological side (à la Emil Kraepelin) was offering? Consider the two differing interpretations of a patient’s story as analyzed by Kraepelin’s followers and by Freud. In 1893, fifty-one-year-old German judge Daniel Paul Schreber became obsessed with the idea that to save the world, he needed to become a woman and give birth to a new human race. He blamed these disturbing thoughts on his psychiatrist, whom he called a “soul murderer” who had implanted these delusions via “divine rays.” Doctors diagnosed Schreber with Kraepelin’s dementia praecox and committed him to a psychiatric hospital, where he eventually died. When Freud read Judge Schreber’s account, Memoirs of My Nervous Illness, he suggested that, instead, Schreber’s behaviors stemmed from repressed homosexual impulses, not from an incurable brain disease. Treat the underlying conflict and you’d treat the person. If you had your choice, which kind of treatment would you pick? Americans overwhelmingly chose Freud, and Kraepelin and his acolytes were consigned to the professional boondocks.

By the 1970s, nearly every tenured professor in psychiatry was required to train as an analyst, and most textbooks were written by them, too. Overnight, it seemed, analysts got “a power, a secular power, that they never had before and they never had since,” psychiatrist Allen Frances told me. You no longer went to your priest or parents; you paid an analyst to shrink you. Now “mind doctors” wanted to mine your “family relations, cultural traditions, work patterns, gender relations, child care, and sexual desire.” Psychiatrists were thrilled to leave the back wards of mental hospitals, where difficult patients had few options for cures, and instead to retrain as analysts and offer lucrative talk therapy (five days a week!) to the so-called worried well, who suffered from cases of nerves brought on by modern life. The people who needed help the most were left behind as analysts comfortably cherry-picked their patients—mostly wealthy, white, and not very sick.

Americans jumped on the couch, embracing the “blank screens” of their therapists and the idea that the mind could be improved. Decades after his death, Freud’s method was suddenly everywhere: in women’s magazines, in advertising (Freud’s nephew Edward Bernays is called the father of public relations); even the CIA started snatching up analysts. Dr. Benjamin Spock’s The Common Sense Book of Baby and Child Care, which was based on Freudian theories, became America’s second-biggest bestseller after the Bible. Another huge book of the moment was Norman O. Brown’s Life Against Death: The Psychoanalytic Meaning of History, which attempted to reframe the past as a Freudian battle between freedom and repression. Hollywood kept psychiatrists on retainer on movie sets. Insurance companies paid for months of talk therapy and reimbursed at levels equal to other serious medical procedures.

No matter how many psychiatrists enlisted, however, there still weren’t enough. By 1970, despite the influx of doctors, the demand exceeded the supply. Unlike the custodians of the sick in the past, psychoanalysts now promised to listen to their patients. In the best cases, patients found clarity and meaning from this relationship. Instead of pathologizing people outright, analysts saw each patient as unique in her psychic suffering. They gave us a deeper understanding of how fraught and layered our interior lives are: the complexities of sexuality; the key role that our childhoods play in our adult lives; how the unconscious speaks to us through our behaviors. Through the “interchange of words between patient and physician,” as Freud put it, you could explore, comprehend, and even heal the sick parts inside us. “Words were originally magic, and the word retains much of its old magical powers even today,” Freud wrote in 1920. “Therefore let us not underestimate the use of words in psychotherapy.”3

One of the many downsides was that doctors enacted vivid blame games on their patients (and their patients’ families), especially on mothers. (See the refrigerator mother [lack of maternal warmth] and the schizophrenogenic mother [an overbearing, nagging, domineering female, usually paired with a weak father], both of whom were believed to create symptoms of schizophrenia and autism in their children.) In his 1967 book The Empty Fortress, Viennese psychoanalyst Bruno Bettelheim,4 a “psychoanalyst of vast impact,” compared the family structure of those with mental illness, especially autism, to a concentration camp—a particularly damning argument because Bettelheim himself had survived two years in Dachau and Buchenwald. In his view, the only way one could recover was to completely sever relationships with family.

But what you didn’t get with Freud was a focus on diagnosis. In fact, his followers practiced “extreme diagnostic nihilism”: Nomenclature and shared diagnostic language didn’t really matter to the analysts. Instead, psychiatrists expanded the scope of social deviance, pathologizing almost everyone in the process, effectively closing the chasm between sanity and insanity by showing that “true mental health was an illusion,” as anthropologist Tanya Marie Luhrmann wrote in Of Two Minds, her study of the profession. According to a now infamous 1962 Midtown Manhattan study based on two-hour interviews with sixteen hundred people in the heart of the city, only 5 percent of the population were deemed mentally “well.” The whole world was suddenly crazy, and psychiatrists were its caped crusaders.

America was again starting to look a lot like it had in the time of Nellie Bly—where anyone could be and often was (mis)diagnosed.

And then, in February 1969, “David Lurie” walked into the intake room at an unspecified hospital in Pennsylvania and set off a metaphorical bomb. He finally proved what so many people had long suspected: Psychiatry had too much power and didn’t know what the hell to do with it.