New and Old Adventures at Harvard, Testosterone, Placebo, False Pregnancies, and Wishes and Fears Coming True

The name “Harvard” carries authority. Social policies in Norway have been swayed by Steven Pinker’s theories. A popular television show and book that telegraphed as its essential message “born that way” rehashed “facts” lifted from Pinker’s The Blank Slate for the Norwegian public as the truths of “hard” science and convinced many people of his views, including those in a position to make policy decisions. In other words, Pinker has become a representative for a discipline that has changed people’s minds about who we are, not only in the United States, but elsewhere. It is vital to stress that if his readers weren’t receptive to the message Steven Pinker and others like him bring to the public, it would drown in obscurity. Its popularity says as much about his readers as it does about him. The Blank Slate was published in 2002, but it lives on.

On January 1, 2015, in a letter to the editor, a man wrote to the New York Times in response to an article about the gap between men and women in technology fields. To bolster his argument that the problem is not an issue of discrimination but rather created by the fact that women simply don’t want to work in technology, he cites Pinker: “As discussed in Steven Pinker’s book ‘The Blank Slate,’ there is substantial scientific evidence to indicate that innate factors are a major cause of the tendency of men and women to be interested in different things. It is sheer political correctness to assume that this plays no role in why men and women tend to pursue careers in different fields.”94 I mention this letter because it illustrates how Pinker and “substantial scientific evidence” act as a seal of approval for the letter writer. Like M’s conviction that male entitlement or dominance is written in the genes, the writer to the Times refers to “innate factors” that keep women out of technology.

What is this “substantial scientific evidence”? During his roundup of sex differences, Pinker writes, “Variation in the level of testosterone among different men, and in the same man in different seasons or at different times of day, correlates with libido, self-confidence, and the drive for dominance.”95 On the heels of this statement, Pinker quotes Andrew Sullivan, a journalist who had low testosterone levels, injected the stuff, and confessed, “I almost got in a public brawl for the first time in my life.”96 Because androgens, of which testosterone is one, are involved in masculinizing the fetus in utero and in producing the secondary sexual characteristics that appear during male puberty, and because men have up to ten times more circulating testosterone than women, testosterone has long been a likely candidate for explaining the male psyche.

In popular culture, testosterone is often seen as the “cause” of male aggression, touted as the explanation for fistfights, criminal gangs, the dog-eat-dog climate of Wall Street, and much more. Testosterone has been linked to both male and female libido, and its levels fluctuate in both sexes. Its role in human psychology remains blurry. Richard Lynn, a professor emeritus of psychology at the University of Ulster, has made an extremist career out of sex and race differences in intelligence. Lynn has been criticized by many scientists and was even chastised by the journal Nature. Pinker does not say men are more intelligent than women, and he is careful not to make a direct link between testosterone and aggression in men. He lets Sullivan’s testimony do that for him. Lynn has no such qualms about the evidence, but his ideas are not much different from the ones many evolutionary psychologists advance. The ideas don’t change all that much. The rhetoric does. In a provocative piece published in the Daily Mail, Lynn explains that men have “a natural advantage” over women. “Take, for example,” he writes, “the case of rutting stags or fighting chimps and you get the generally aggressive idea. Thanks to high levels of the male sex hormone testosterone, men are far more competitive and motivated for success than women.”97

If you castrate male mice and rats, they lose their aggressive impulses. If you replace the lost testosterone, they will fight again. I am not contemptuous of rat and mouse studies and their possible relation to people. We share traits with our mammalian cousins, which is why the epigenetic studies I mentioned earlier have been seen as potentially significant for human beings, not only for rats and mice. Estrogen has also been implicated in mouse aggression. Male mice that have been genetically altered to lack the enzyme that converts testosterone into estrogen also lose their aggressive and dominant behaviors. Hormones are complex. It is interesting, however, that after heroic efforts and hundreds of studies, scientists have been unable to establish a definitive link between testosterone and aggressive behaviors in human beings.

In a paper published in 2010, Scott H. Liening and Robert A. Josephs sum up the research: “Despite considerable evidence for testosterone’s connection to dominance, research on testosterone’s effect on human social behavior has been frustratingly inconsistent. Although many studies have found an association between testosterone and behavior, many others have found weak or nonexistent effects. These null findings range from competitive behaviors to aggression, both physical and non-physical.”98 A sampling of the findings: One study found that divorced men have higher levels of the hormone than men in stable marriages. Another found higher levels in women lawyers than women teachers and nurses, suggesting the hormone’s correlation with higher occupational status.99 Several studies have proposed that testosterone levels rise in response to a conflict or challenge; that is, this rise is not the cause of but perhaps the consequence of aggression. Some studies find this response only in males. Others have found it in both sexes for testosterone and estrogen: “These data . . . [provide] support for the notion that estrogen may play a significant role in the production of aggressive behavior in both sexes.”100 The author of a 2003 survey of the role of hormones in childhood and adolescence acknowledges that the studies on a relation between hormones and aggression are inconsistent. He suggests that hormones may act as “a cause, a consequence, or even as a mediator” of aggression.101

A 2010 paper by Christoph Eisenegger et al., “Prejudice and Truth About the Effect of Testosterone on Human Bargaining Behavior,” concluded that testosterone increased fair bargaining behavior in the women who took it. However, in a parallel study, if the women were told they were getting testosterone, even if they were given a placebo, they became self-serving and greedy.102 One of the researchers, Michael Naef, said, “It appears that it is not testosterone itself that induces aggressiveness but rather the myth surrounding the hormone.”103 Shall we conclude from this single study, as the Telegraph did in its headline, that “Testosterone Makes People More Friendly and Reasonable”?104 Perhaps rather than hardwire our way to happiness, we should inject our way there.

If there were a direct relationship between testosterone and aggressive or even dominant behaviors, one might expect a flood of the hormone to increase both qualities. The case of a four-year-old boy with an androgen-producing tumor is instructive. The child displayed pubertal changes at his tender age—penis growth, pubic hair, erections, sweating, and a precocious interest in older girls and images of naked women. The authors of the paper note that there was absolutely no sign in the boy of either aggressive or dominating behavior. On the contrary, they describe him as “anxious and withdrawn.”105 If it happened to me, I would be anxious and withdrawn, too.

Allan Mazur has studied the question of the hormonal relation to dominance and aggression in human beings for many years. At the end of his essay “Dominance, Violence, and the Neurohormonal Nexus,” he cites two of his own large-scale studies. One, conducted on U.S. veterans, found higher levels of testosterone among veterans who were likely to have been participants in “inner-city honor cultures” before their service than among those who weren’t. The other study found no such relation. He concludes, “Like so many questions about neuroendocrinology and behavior, we do not have a clear answer.”106

None of this means that there may not be some complex relation between both testosterone and estrogen and aggression, dominance, and cooperation in people, nor does it mean that hormones play no role in psychological sex differences. A much-publicized study found that men’s testosterone levels drop when they become fathers.107 Perhaps mothers’ levels do as well, but I have been unable to find any research on that. Estrogen levels drop in women after childbirth and during breast-feeding. Androgens have been related to women’s sexual functioning, but that link is unclear. So what does this all mean? Human beings and laboratory rats are not alike when it comes to testosterone and behavior. It seems to me that a one-to-one correspondence between a single hormone and aggression, dominance, and/or cooperation is unwarranted.

In fact, the dynamic character of the human neuroendocrine system may not be well captured in any of these studies. The research further suggests that belief and the context for belief have profound effects on our bodily states, perceptions, and behaviors, a fact that turns us back to the fundamental mind-body question. Andrew Sullivan is not alone in proclaiming that infusions of testosterone made him feel brawny, tough, invigorated, and ready for battle. Nor do I doubt for one instant his veracity. There is scant evidence, however, that testosterone was responsible for those feelings. A woman I know who takes testosterone because her levels are low told me she believes the hormone has helped her feel less tired. Perhaps Sullivan interpreted a boost in energy through a macho perceptual lens. And yet, the question remains: If it’s not the hormone, what is it?

For centuries, physicians have been providing their patients with moral support in the form of pills, tonics, and other remedies they have regarded as medically useless. The placebo effect might be described as the beneficial effect of the nontreatment treatment. Some researchers have begun to regard placebo not as a minor irritant that interferes with the “real” effects of active drugs but as a powerful physiological reality, which may help us understand more about healing in general. Placebo has been correlated with the release of endogenous opioids (endorphins) in the brain, as well as other nonopioid changes that induce genuine therapeutic effects. Studies have documented the release of endogenous dopamine in Parkinson’s patients after placebo treatments and improvement among patients who underwent sham surgeries.108 In fact, in the study that included sham knee surgery, the patients who had the real operation did no better than those who had the “pretend” surgeries. One can also conclude, of course, that this particular kind of knee surgery may be a procedure due for reconsideration. A controversial but interesting study by Irving Kirsch and Guy Sapirstein found that a substantial effect of antidepressant medicines was the result, not of any active ingredient in the drug, but of the placebo effect.109 In 1996, Fabrizio Benedetti demonstrated that placebo analgesia, which is mediated by the release of endogenous opioids in the brain, can be blocked by naloxone, an opioid antagonist.110 Sugar pills and other inert substances have also generated nausea, vomiting, headache, and other nasty “side” effects, a phenomenon known as nocebo. The question is: How does all this work and what does it tell us about the mind-body problem and the person-environment question?

In a 2005 paper, Benedetti and his colleagues define the placebo effect as “the psychosocial context around the patient.” They then write that placebo is a model for understanding “how a complex mental activity, such as expectancy, interacts with different neuronal systems.”111 Look closely at their language. First, if placebo is the “psychosocial context around the patient,” one has to ask, how does it become the patient? Nothing short of this metamorphosis needs to be explained. I have often used context in the way the authors do, as a surrounding that affects the inside, so my critique is not meant to illustrate their shoddy thinking but rather to point out how what seems lucid when it is articulated can actually be murky when it is closely examined. Second, the idea of “interaction” implies that the psychological “complex mental activity” and “neuronal systems” are somehow distinct from each other and they cross paths in an unknown way. How does the thought or expectation I am going to get better, conscious or unconscious, “interact” with neurons?

The authors’ notion of interaction doesn’t resolve mind-body dualism. In another paper, Benedetti refers explicitly to the mind-body unit. How are we meant to understand this? Does the term reinforce or undermine the mind-body division? The neuronal brain processes have a material reality, but the complex mental activity does not, or, if it does, we don’t know what or where it is and how it can interact with brain matter. It has no home in the model. Indeed, it raises the question Descartes, Princess Elisabeth, Cavendish, and many others wrestled with long ago—how can mind stuff be separate from body stuff and, if the two are separate, how on earth would they interact? What exactly is “complex mental activity” and what is it made of? Surely the authors do not mean that complex mental activity is insubstantial, made of mysterious spirits or soul. Do they mean that this complex mental activity is somewhere inside a person but separate from the neurons in her brain? Is their argument dependent on the idea that brain and mind are indeed separable? I would repeat the question Margaret Cavendish posed in her Philosophical Letters of 1664: “I would fain ask them, I say, where their Immaterial Ideas reside, in what part or place of the Body?”112 If confronted, I suspect their answer would be the one often given by neuroscientists: we do not know how psychological factors relate to neurobiological factors. This is a missing link of enormous proportions. All references to “neural correlates, substrates, and underpinnings” for fear, love, memory, consciousness, or any other state imply this gaping hole in the understanding of mind-brain processes.

I am not filling in the hole. I am pointing to what is known as the “explanatory gap” between mind and brain. Benedetti & Co. are in no way exceptional among neuroscientists in their description of two interacting levels. They are typical. The big problem is that even if we arrive at a moment when the whole brain—its synaptic connections, individual neurons, and chemistry—can be beautifully described, the gap remains. Is it reasonable to describe a person’s subjective thoughts, dreams, hopes, and wishes through neuronal processes? How do we know the two are the same thing? Could it ever be proven? Not only that, if thoughts, expectations, personality, desires, and suggestions are described as agents that change brain function, how does that work? If you are a dualist you will have to explain how these states work on neurons. If you are a monist and a materialist, you will feel deeply dissatisfied by this description. Isn’t the brain responsible for all of it? Luria regarded the immediate reduction from the psychological to the neurobiological as an error. Drawing a straight line between a person’s fear and the amygdala, for example, an almond-shaped part of the brain that is clearly implicated in fear experiences, is unwarranted. This may be true, but the gap doesn’t vanish.

Arguably, human beings are continually under the sway of placebo-like effects. The women who were injected with what they believed to be testosterone, which has been identified repeatedly as the “male” hormone, turned into caricatures of masculine brutes, as, by his own account, did the ordinarily pacific Sullivan. Placebo and belief effects open up possibilities for rethinking the mind-body and nature/nurture questions. Neurobiologists are interested in tracking the placebo effect through brain scans, relating it to particular areas (the anterior cingulate cortex, for example), but many of them skirt the uncomfortable questions placebo raises, and they usually do it by settling for a mind-body divide that makes little sense in terms of their own physiological models. In her introduction to an interdisciplinary book on placebo, Anne Harrington is more direct. After it became clear in the 1970s that expectation could trigger a neurochemical response in the brain, she summarizes the problem: “Endorphin release . . . became just one more placebo-generated phenomenon to be explained—and we still did not understand the processes whereby a person’s belief in a sham treatment could send a message to his or her pituitary gland to release its own endogenous pharmaceutics.”113 This is the conundrum precisely.

In the collection of texts on placebo she edited, Harrington includes one by Robert Ader, who writes about the experiment he and his colleagues conducted on rats in 1975. The rats were given saccharin-flavored drinking water at the same time as they were injected with a drug that suppressed their immune systems, which made them ill and killed many of them. When a subgroup of the rats continued to get the harmless drinking water but no longer received the injections, they got sick and many died anyway. Why? The animals seemed to be under a kind of Pavlovian nocebo effect. The coupling of sweetened water and deadly drug created a conditioned response, which affected their immune systems.114 They appeared to have been conditioned into immune collapse. The study showed that the noxious effects of nocebo are not limited to human beings. As far as we know, rats do not have reflectively self-conscious thoughts. They do not say, “I am going to get sick,” and get sick; they are not constantly narrating their lives to themselves. And yet they appear to have a version of the nocebo phenomenon.

Cases of false pregnancy have been documented since Hippocrates, and the physical changes that take place in people who believe themselves to be pregnant are objective, not subjective. There are those who suffer from delusions that they are pregnant or about to give birth, but it is obvious to everyone around them that they are crazy. The changes wrought by false pregnancy or pseudocyesis are not delusional. They include abdominal swelling that increases over time; cessation of the menstrual cycle; breast growth, often with secretions; fetal movements that can be felt, not only by the person herself but by others, including doctors; nausea; enlargement of the uterus; and changes in the cervix. There is no question that in these cases neuroendocrine and hormonal levels are altered by a wish to be pregnant. Although false pregnancy is now rare in developed countries, it is not uncommon in Africa. According to one estimate, 1 in every 160 patients who sought infertility treatment in a particular clinic in Sudan developed pseudocyesis. It is generally acknowledged that one of the reasons for the higher incidence among women in these countries is that fertility has greater social value than it does in more developed countries.115 Also, the use of sonograms, now routine in the United States and Europe, is less common in Africa.116 Once a woman has been given proof she is not pregnant, the signs of pseudocyesis usually disappear.

Despite the many studies that have documented the physiological changes and speculated on various psychiatric and hormonal reasons for the transformation, there is no consistent neuroendocrine profile in these patients and no explanation for how these changes occur. A 1982 study on two patients suggested an impairment in dopaminergic function. The authors of the study do not speculate on how a wish is literally embodied in a person and can affect hypothalamic-pituitary processes.117 How does the content of a wish—I want to be pregnant—create the physical appearance and feelings of pregnancy? There have been cases of false pregnancy in men, but they are rare and have often been accompanied by a psychotic disorder.

I did find one documented case of false pregnancy in a man who was not mentally ill.118 According to the author of the 1988 paper, Deirdre Barrett, the man in question felt comfortable with the term “transsexual.” He had always felt he was a woman trapped in a man’s body, but he only rarely dressed as a woman, had not sought surgery to alter his body, and did not use the pronoun “she.” He was still grieving for his male partner, who had died some months before, when he sought out a doctor for hypnosis. The man wanted to quit smoking. During the hypnotic session, the doctor asked his patient to imagine being the person he wanted to be, and he imagined himself pregnant. He had long had this fantasy, and although it isn’t mentioned in the report, it seems to me his wish had a sound dream logic.

After the death of his beloved partner, wouldn’t an imaginary pregnancy keep the deceased alive inside him? Isn’t it also a metaphor for his grief? Not long after the hypnotherapy, the man’s belly began to swell. He suffered nausea in the morning, noticed “watery secretions from his nipples,” and felt a rhythmic heartbeat in his abdomen. He sought medical help for what he himself identified as a “false pregnancy.” According to Barrett, the patient vacillated between knowing he was not pregnant and wondering whether he might actually be carrying a fetus. He inquired about a supposed case of genuine male pregnancy in France and brought up experiments aimed at altering male mice so they could carry pregnancies. Hypnosis coupled with the wish seems to have produced the pseudocyesis.

Far more common among men is couvade syndrome: an expectant father develops some of the symptoms of pregnancy—nausea, cravings, cramps, bloating, irritability—a sympathetic (or envious) response to his partner’s changing physical reality, which his own body mimics, a kind of mysterious contagion from one body to another. In some cultures, male participation in pregnancy is ritualized and, among the men who enact these rituals, there are those who develop external signs of fetal growth. Pseudocyesis also occurs in animals. It is common in cats and has sometimes been linked to a female cat’s loss of her litter. Can a cat’s desire or loss be considered? Is pseudocyesis a form of bodily conditioning for pregnancy that includes the imagination or mental images in human beings? The endocrine system of glands through which hormones are secreted into the circulatory system is not well understood. Confident assertions about any single hormone’s relation to complex human psychology may be premature. Furthermore, the fact that hormonal levels and endogenous opioids are affected by one’s culture, by one’s beliefs and desires, should make any thoughtful person think carefully about how he frames the question of what we mean by “biology.”

How does a verbal suggestion or internal wish create mystifying changes in a person’s body? Ever since Franz Anton Mesmer scandalized the European medical community in the eighteenth century with his spectacular demonstrations of animal magnetism, science has had a queasy relation to hypnotic suggestion. It was briefly dignified by the great French neurologist Jean-Martin Charcot, who used it as a technique for demonstrating the nature of hysteria. Charcot wrongly believed that only hysterics could be hypnotized. The powers of suggestion, however, were widely regarded as a fact of human biological reality and were studied by Charcot’s younger colleague, the psychologist and philosopher Pierre Janet, and by Sigmund Freud and Josef Breuer, all of whom worked with hysterical patients.

Late in his career, Charcot said, “We now know without a doubt that, in certain circumstances a paralysis can be produced by an idea, and also that an idea can cause it to disappear.”119 According to this theory, hysterical symptoms were caused by an autosuggestion that took effect in the patient. Janet maintained that “ideas” had the power to alter neurobiology through a kind of disconnection or dissociation in the brain, an alienation of one system from another. Janet and Freud were both invested in understanding a patient’s particular emotional history and how ideas connected to a traumatic event or events created a suggestion, which was then converted into symptoms. Janet is not a household name, but his writing about dissociation is highly sophisticated, and some contemporary scientists trying to understand conversion hysteria or conversion disorder have resurrected his work.120

Unlike pseudocyesis, cases of conversion are common. One would have to search high and low to find a neurologist or psychiatrist who has not seen numbers of patients suffering from the myriad inexplicable symptoms of hysteria, including blindness, deafness, paralyses, contractures, and seizures. After a surge of medical and popular interest in the late nineteenth and early twentieth centuries, hysteria became a medical embarrassment. Interest waned, but the patients kept coming. Many of them (if they are not combat veterans) are women, and all complaints that afflict more women than men have been and continue to be treated with a mixture of condescension and contempt by many members of the medical establishment.

Recently, brain scans have propped up hysteria as a condition of renewed medical interest because it is now possible to distinguish on an fMRI or PET scan between a hysterical and a feigned paralysis, for example. They do not look the same. In fact, a conversion paralysis resembles a paralysis induced under hypnosis by suggestion.121 Charcot and Janet appear to have been right. Hysteria may be a suggestion-induced symptom, although the suggestion may not be conscious. The right temporo-parietal junction is hypoactive (less active) in people with hysterical conversion tremors compared to people who pretend to have tremors.122 This is the same area that the wizard in the Daily Mail designated our “moral compass.” The conversion patient is really paralyzed. His paralysis, however, is different from that of another patient who sustains damage to her spine and loses the use of her legs. Hysteria creates the same philosophical problems as nocebo or false pregnancy. How do ideas, beliefs, wishes, and fears transform bodies? Is this mind over matter? Is this a case of psychological factors interacting with physiological factors? If one accepts the reality of ideas altering bodies, what does it mean for the mind-body problem?

A related mystery appears in cases of dissociative identity disorder (DID), or what used to be called multiple personality disorder, an illness also intensely studied in the late nineteenth and early twentieth centuries and one that fell under the broader rubric of hysteria. Almost always connected to childhood trauma, the disorder seems to arise when a patient needs more than one persona to adapt to an impossible reality. The fact that this disorder turned into an epidemic in the 1980s has often been read as a testimony to its “fake,” “manufactured,” or “unreal” character. No doubt there were frauds during the high contagions of multiple personality, but it seems more interesting to ask whether traumatized people are more vulnerable to some forms of suggestion. Research has confirmed that there are physiological differences between one personality and another in people with dissociative identity disorder. They include allergic sensitivities—one personality has hay fever; another doesn’t—differences in endocrine function, skin arousal, color vision, and varied responses to the same medicine.123 What is one to make of this in terms of mind and body—or several minds in one body in this case?

Every human being has various modes of being—the social self, the family self, the private self—but most of us in the West hold on to the idea that each one of us is singular and unified. I wonder whether an actor caught up in playing a character unlike himself shows similar physiological changes. And what about a novelist who has spent five years writing in the first person as someone very different from herself? Does the imaginative movement into another persona or into a fictional character have a measurable effect on her physiology? As far as I know, there is no research one way or the other, probably because the current theoretical framework does not permit these questions to be asked, and if a question is not asked, it cannot be pursued.

A 2007 study tracked a traumatized DID patient who suffered from cortical blindness. After fifteen years of blindness, she gradually regained vision during psychotherapy. “At first,” the authors write, “only a few personality states regained vision, whereas others remained blind. This could be confirmed by electrophysiological measurement, in which visual evoked potentials (VEP) were absent in the blind personality states but were normal and stable in the seeing states.”124 In “The Neural Basis of the Dynamic Unconscious,” Heather Berlin comments on the 2007 paper: “This case shows that, in response to personality changes, the brain has the ability to prevent early visual processing at the cortical level.”125 The brain may be able to do this, but again, how are these personality changes induced to begin with? How is the brain responding to such changes, and, if the statement is posed that way, doesn’t it suppose that personality and the brain are somehow different? Where do we locate personality?

David Morris, who has written eloquently about placebo, advises that we think about “human beings and complex human events like health and illness as constructed at the intersection of culture and biology.”126 The word “intersection,” like the word “interaction,” is an attempt to bring together what has been kept apart: the biological person of flesh and bone and the culture and its ideas, which are outside her. The philosophical problem is not solved, however, by Morris’s intersection. Exactly how does the intersection work? Conflating culture and biology is not a solution. One might ask, when and how does culture become biology? Or more radically, how is biology cultural? Or, perhaps better, how are ideas embodied? Hysteria, dissociation, pseudocyesis, placebo, and nocebo are fascinating because they force a reexamination of not one but several conventional concepts—those boxes into which scholars of all sorts pack up their knowledge and hope to God the contents don’t begin to stir and jump out at them in the middle of the night.