37
THE WAGES OF REPRESSION

When Dr. Michael Gottlieb, from the University of California at Los Angeles, arrived in Washington in the second week of September 1981 for a conference at the National Institutes of Health (NIH), he was optimistic that the American medical authorities were finally taking seriously a new illness that, he feared, could soon reach epidemic proportions. The NIH is the world’s biggest and most powerful medical organisation. Housed on a campus of more than 300 acres, ten miles northwest of Washington, D.C., in the Bethesda hills, it had by the end of the century an annual budget of $13 billion and included, among other things, the National Institute of Allergy and Infectious Diseases, the National Heart, Lung and Blood Institute, and the National Cancer Institute.

The conference Gottlieb was attending had been called at the NCI to explore a rash of cases in the United States of a rare skin cancer known as Kaposi’s sarcoma.1 One of the other doctors at the conference was Linda Laubenstein, a blood specialist at New York University. She had first seen KS in a patient of hers in September 1979, when it had presented as a generalised rash on a man’s skin, associated with enlarged lymph nodes. At that point she had never heard of KS, and had looked it up after the cancer had been diagnosed by a dermatologist. This particular type of illness had originally been described in 1871 among Mediterranean and Jewish men, and in the century that followed, between 500 and 800 cases had been reported. It was also seen among the Bantu in Africa. It usually struck men in their forties and fifties and was generally benign; the lesions were painless, and the victims usually died much later of something else. But, as Laubenstein and Gottlieb now knew, KS in America was much more vicious: 120 cases had already been reported, often in association with a rare, parasitical form of pneumonia – Pneumocystis – and in 90 percent of cases the patients were gay men.2 An added, and worrying, complication was that these patients also suffered from a strange deficiency in their immune system – the antibodies in their blood simply refused to fight the different infections that came along, so that the men died from whatever illnesses they contracted while their bodies were already weakened by cancer.

Gottlieb was astonished by the Bethesda conference. He had arrived amid rumours that the NIH was at last going to fund a research program into this new disease. The Center for Disease Control, headquartered in Atlanta, Georgia, had been trying to trace where the outbreak had started and how it had spread, but the CDC was just the ‘shock troops’ in the fight against disease; it was now time for more fundamental research. So Gottlieb sat in quiet amazement as he and the others were lectured on KS and its treatment in Africa, as if there were no awareness at NIH that the disease had arrived in America, and in a far more virulent form than across the Atlantic. He left the meeting bewildered and depressed and returned to Los Angeles to work on a paper he was planning for the New England Journal of Medicine on the links he was observing between KS and Pneumocystis carinii. But he found that the Journal was not ‘overly enthusiastic’ about publishing his article, and kept sending it back for one amendment after another. All this prevarication left Gottlieb with the feeling that, among the powers-that-be of the medical world at least, the new outbreak was getting less attention than it deserved – and that this was because the great preponderance of its victims were homosexual.3

It would be another year before this set of symptoms acquired a name. First it was GRID, standing for Gay-Related Immune Deficiency, then ACIDS, for Acquired Community Immune Deficiency Syndrome, and finally, in mid-1982, AIDS, for Acquired Immune Deficiency Syndrome. The right name for the disease was the least of the problems. In March of the following year the New York Native, a Manhattan gay newspaper, ran the headline, ‘1,112 AND COUNTING.’ That was the number of homosexual men who had died from the disease.4 But AIDS was significant for two reasons over and above the sheer numbers it cut down, tragic though that was. It was important, first, because it encompassed the two great strands of research that, apart from psychiatric drugs, had dominated medical thinking in the postwar period; and, second, because the people it cut down were, disproportionately, involved in artistic and intellectual life.

The two strands of inquiry dominating medical thought after 1945 were the biochemistry of the immunological system, and the nature of cancer. After the first reports in the early 1950s about the links between smoking and cancer, it was soon observed that there was almost as intimate a link between smoking and heart disease. Coronary thrombosis – heart attack – was found to be much more common in smokers than in nonsmokers, especially among men, and this provoked two approaches in medical research. In heart disease the crucial factor was blood pressure, and this deviated from the norm for two main reasons. Insofar as smoking damaged the lungs and made them less efficient at absorbing oxygen from the air, each breath sent correspondingly less oxygen into the body’s system, causing the heart to work harder to achieve the same effect. Over time this imposed an added burden on the muscle of the heart, which eventually gave out. In such cases blood pressure was low, but high blood pressure was a problem also, this time because it was found that foods high in animal fats caused deposits of cholesterol to be laid down in the blood vessels, narrowing them and, in extreme cases, blocking them entirely. This also put pressure on the heart, and on the blood vessels themselves, because the same volume of blood was being forced through less space. In extremes this could damage the muscle of the heart and/or rupture the walls of the blood vessels, including those of the brain, in a cerebral haemorrhage, or stroke. Doctors responded by trying to devise drugs that either raised or lowered blood pressure, in part by ‘thinning’ the blood, and, where the heart had been irreparably damaged, by replacing the organ in its entirety.

Before World War II there were in effect no drugs that would lower blood pressure. By 1970 there were no fewer than four families of drugs in wide use, of which the best known were the so-called beta-blockers. These drugs grew out of a line of research that dated back to the 1930s, in which it had been found that acetylcholine, the transmitter substance that played a part in nerve impulses (see chapter 28, page 501), also exerted an influence on nervous structures that governed the heart and the blood vessels.5 In the nerve pathways that eventually lead to the coronary system, a substance similar to adrenaline was released, and it was this which controlled the action of the heart and blood vessels. So began a search for some way of interfering with – blocking – this action. In 1948 Raymond Ahlquist, at the University of Georgia, found that the receptors involved in this mechanism consisted of two types, which he arbitrarily named alpha and beta, because they responded to different substances. The beta receptors, as Ahlquist called them, stimulated both the rate and the force of the heartbeat, which gave James Black, a British doctor, the idea to see whether blocking the action of adrenaline on them might help reduce this activity.6 The first substance he identified, pronethalol, was effective but was soon shown to produce tumours in mice and was withdrawn. Its replacement, propranolol, had no such drawbacks and became the first of many ‘beta-blockers.’ They were in fact subsequently found to have a wide range of uses: besides lowering blood pressure, they prevented heart irregularities and helped patients survive after a heart attack.7

Heart transplants were a more radical form of intervention for heart disease, but as doctors watched developments in molecular biology, the option grew more attractive, since it was realised that, at some point, cloning might become a possibility. The central intellectual problem with transplants, apart from the difficult surgery involved and the ethical problems of obtaining donor organs from newly deceased individuals, was immunological: the organs were in effect foreign bodies introduced into a person’s physiological system, and were therefore rejected as intruders.

The research on immunosuppressants grew out of cancer research, in particular leukaemia, which is a tumour of the lymphocytes, the white blood cells that rapidly reproduce to fight off foreign bodies in disease.8 After the war, and even before the structure of DNA had been identified, its role in reproduction suggested it might have a role in cancer research (cancer itself being the rapid reproduction of malignant cells). Early studies showed that particular purines (such as adenine and guanine) and pyrimidines (cytosine and thymine) did affect the growth of cells. In 1951 a substance known as 6-Mercaptopurine (6-MP) was found to cause remission for a time in certain leukaemias. The good news never lasted, but the action of 6-MP was potent enough for its role in immunosuppression to be tested. The crucial experiments were carried out in the late 1950s at the New England Medical Center, where Robert Schwartz and William Dameshek decided to try two drugs used for leukaemia – methotrexate and 6-MP – on the immune response of rabbits. As Miles Weatherall tells the story in his history of modern medicine, this major breakthrough turned on chance. Schwartz wrote to Lederle Laboratories for samples of methotrexate, and to Burroughs Wellcome for 6-MP.9 He never heard from Lederle, but Burroughs Wellcome sent him generous amounts of 6-MP. He therefore went ahead with this and found within weeks that it was indeed a powerful suppressor of the immune response. It was subsequently found that methotrexate had no effect on rabbits, so, as Schwartz himself remarked, if the responses of the two companies had been reversed, this particular avenue of inquiry would have been a dry run, and the great breakthrough would never have happened.10 Dr. Christiaan Barnard, in South Africa, performed the world’s first heart transplant between humans in December 1967, with the patient surviving for eighteen days; a year later, in Barnard’s second heart-transplant operation, the patient lived for seventy-four days. A nerve transplant followed in Germany in 1970, and by 1978 immunosuppressant drugs were being sold commercially for use in transplant surgery. In 1984, at Loma Linda University Medical Center in California, the heart of a baboon was inserted into a two-week-old girl. She survived for only twenty days, but new prospects of ‘organ farming’ had been opened up.11

By the time the AIDS epidemic appeared, therefore, a lot was already known about the human body’s immunological system, including a link between immune suppression and cancer. In 1978 Robert Gallo, a research doctor at the National Cancer Institute in Bethesda, discovered a new form of virus, known as a retrovirus, that caused leukaemia.12 He had been looking at viruses because by then it was known that leukaemia in cats – feline leukaemia, a major cause of death in cats – was caused by a virus that knocked out the cats’ immune system. Japanese researchers had studied T cell leukaemia (T cells being the recently discovered white blood cells that are the key components of the immune system) but it was Gallo who identified the human T cell leukaemia virus, or HTLV, a major practical and theoretical breakthrough. Following this approach, in February 1983, Professor Luc Montagnier, at the Pasteur Institute in Paris, announced that he was sure he had discovered a new virus that was cytopathic, meaning it killed certain kinds of cell, including T lymphocytes. It operated like the feline leukaemia virus, which caused cancer but also knocked out the immune system – exactly the way AIDS behaved. Montagnier did not feel that ‘his’ virus was the leukaemia virus – it behaved in somewhat different ways and therefore had different genetic properties. This view strengthened when he heard from a colleague about a certain category of virus, known as lentiviruses, based on the Latin lentus, ‘slow’.13 Lentiviruses lie dormant in cells before bursting into action. That is what seemed to happen with the AIDS virus, unlike the leukaemia virus. Montagnier therefore called the virus LAV, for lymphadenopathy-associated virus, because he had taken it from the lymph nodes of his patients.14

*

Intellectually speaking, there are five strands to current cancer research.15 Viruses are one; the others are the environment, genes, personality (reacting with the environment), and auto-immunology, the idea that the body contains the potential for cancerous growth but is prevented from expressing it by the immune system until old age, when the auto-immune system breaks down. There is no question that isolated breakthroughs have been made in cancer research, as with Gallo’s viral discoveries and the discovery of the link between tobacco and cancer, but the bleak truths about the disease were broadcast in 1993 by Harold Varmus, winner of the 1989 Nobel Prize in physiology or medicine and head of the NIH in Bethesda, and Robert Weinberg of MIT in their book Genes and the Biology of Cancer.16 They concluded that tobacco accounts for 30 percent of all cancer deaths in the United States, diet a further 35 percent, and that no other factor accounts for more than 7 percent. Once the smoking-related tumours are subtracted from the overall figures, however, the incidence and death rates for the great majority of cancers have either remained level or declined.17 Varmus and Weinberg therefore attach relatively little importance to the environment in causing cancer and concentrate instead on its biology – viruses and genes. The latest research shows that there are oncogenes – mutated forms of normal genes (proto-oncogenes), sometimes activated by viruses – that bring about abnormal growth, and tumour-suppressor genes that, when missing or disabled, fail to prevent abnormal growth. Though this may represent an intellectual triumph of sorts, even Varmus and Weinberg admit that it has not yet been translated into effective treatment. In fact, ‘incidence and mortality have changed very little in the past few decades.’18 This failure has become an intellectual issue in itself – the tendency for government and cancer institutes to say that cancer can be cured (which is true, up to a point), versus the independent voice of the medical journals, which underline from time to time that, with a few exceptions, incidence and survival rates have not changed, or that most of the improvements occurred years ago (also true).

Such debate, bitter at times, has often made cancer seem far more dreadful than other diseases, and it was this that provoked Susan Sontag, herself recovering from cancer, to write the first of two celebrated essays on illness. Her chief argument in Illness as Metaphor (1978) is that disease in general, and cancer in particular, is used in the late twentieth century as a metaphor for all sorts of political, military, and other processes, which demonise the illness and, more to the point, separate the sufferer from her/his family, friends, and life.19 In many combative passages, she compares cancer now to TB a few generations ago. Illness, she says, ‘is the night-side of life, a more onerous citizenship.’20 There is, or is supposed to be, something uniquely frightening about cancer, so that even today, in France and Italy, it is still the rule for doctors to communicate a cancer diagnosis to the patient’s family, not the patient. Since getting cancer jeopardises one’s love life, chances of promotion, or even a job, people learn to be secretive. In literature, she points out, TB represents disintegration – it is ‘a disease of liquids’ – whereas cancer symbolises degeneration, ‘the body tissue turning to something hard … a demonic pregnancy.’21 TB affects the lungs, the ‘spiritual’ part of the body, whereas ‘cancer is notorious for attacking parts of the body (colon, bladder, rectum, breast, cervix, prostate, testicles) that are embarrassing to acknowledge.’ Having a tumour generally arouses some feelings of shame, but ‘in the hierarchy of the body’s organs, lung cancer is felt to be less shameful than rectal cancer.’22 The most striking similarity between TB and cancer, she says, is that both are diseases of passion – TB a sign of inward burning, romantic agony, whereas cancer ‘is now imagined to be the wages of repression.’ Surveying a wide range of literature, from The Wings of the Dove to The Immoralist to The Magic Mountain to Long Day’s Journey into Night to Death in Venice, she finds the transformation of TB, a dreadful disease, into something romantic ‘preposterous’ – a distortion, as she sees it, and one that she does not want repeated with cancer.

Illness as Metaphor, provoked by Susan Sontag’s own experience, was described by Newsweek as ‘one of the most liberating books of our time.’ In AIDS and Its Metaphors, published a decade later, in 1989, Sontag returned to the attack.23 AIDS she saw as one of the most ‘meaning-laden’ of diseases, and her aim was to ‘retire’ some of the many metaphors it had acquired. Sontag wanted – fiercely – to combat the idea of punishment that was attaching itself to AIDS, to challenge ‘a predictable mix of superstition and resignation [that] is leading some people with AIDS to refuse antiviral chemotherapy.’24 She reserved special venom for those, like the Christian right, who argued that AIDS was a retribution for the sins and indulgences, the ‘moral laxity and turpitude,’ of the 1960s – a retribution, above all, on those who were homosexual, understood as in some way abnormal. This Kulturkampf, she said, went beyond America. In France, where she lived part of the time, one right-wing politician had dismissed certain opponents as sidatique (‘AIDS-ish’), or as suffering from ‘mental AIDS.’ But, she asked, could not AIDS better be understood as a capitalist-type disease of the consumer society, in which ‘appetite is supposed to be immoderate…. Given the imperatives about consumption and the virtually unquestioned value attached to the expression of self, how could sexuality not have come to be, for some, a consumer option: an exercise of liberty, of increased mobility, of the pushing back of limits? Hardly an invention of the male homosexual subculture, recreational, risk-free sexuality is an inevitable reinvention of the culture of capitalism.’25 She thought that the metaphors of AIDS had diminished us all. They had, for example, helped introduce the sad form of relationship that appeared in the late 1980s – telephone sex, which had the merit, if that is the word, of being safe. We had been further diminished by the widespread campaigns for the use of condoms and clean needles, which, she said, were ‘felt to be tantamount to condoning and abetting illicit sex, illegal chemicals.’26 It was time, she concluded, to understand illness, cancer, and AIDS for what they are: diseases of the body, with no moral or social or literary layers of meaning.

Other factors helped account for the changing perception of AIDS, among them the nature and quality of the victims themselves. When the Hollywood Reporter ran an item of news in its issue of 23 July 1985 saying that the handsome film actor Rock Hudson was suffering from AIDS, the illness finally received the publicity that, given its killing power, it had long deserved.27 But besides being the first AIDS victim most people had heard of, Hudson was significant in being an actor. Over the following years, the arts and the humanities lost hundreds of bright lights as, despite the isolation of the virus responsible, AIDS took its toll: Michel Foucault, philosopher, June 1984, aged fifty-seven; Erik Bruhn, ballet dancer, 1986, aged fifty-eight; Bruce Chatwin, travel writer, January 1989, aged forty-eight; Robert Mapplethorpe, photographer, March 1989, aged forty-two; Keith Haring, graffiti artist, February 1990, aged thirty-one; Halston, fashion designer, March 1990, aged fifty-seven; Tony Richardson, film director, November 1991, aged sixty-three; Anthony Perkins, actor, September 1992, aged sixty; Denholm Elliott, actor, October 1992, aged seventy; Rudolf Nureyev, the most famous dancer of his day, who had defected from Russia in 1961, who had been director of the Paris Opera Ballet and danced for every leading company in the world, in January 1993, aged fifty-four. No disease this century has produced such carnage in the intellectual and artistic field.28

Carnage of a different kind took place in the psychiatric ward. On 29 March 1983 Dr. John Rosen surrendered his medical licence in Harrisburg, Pennsylvania. He did this in order to avoid being tried by the State Board of Medical Education and Licensure of the Department of State of Pennsylvania, which was preparing to accuse him of sixty-seven violations of the Pennsylvania Medical Practices Act and thirty-five violations of the rules and regulations of the Medical Board.29 Some of the abuses Rosen had subjected his patients to were horrific, none more so than in the case of Janet Katkow, who was taken to see him by her parents (the following details are taken from court documents, part of the public record). At their very first meeting, in front of her parents, Rosen asked Katkow if she had enjoyed her first sexual experience. She did not reply. When she expressed the wish to return to her home in the mountains of Colorado, he immediately made a ‘deep interpretation’ and explained that the snow-capped mountains were ‘the next best thing’ to ‘a breast filled with mother’s milk.’ ‘Defendant then told Plaintiff’s mother that he had something better for Plaintiff to suck on and he simultaneously patted his groin with one hand.’30 For the next seven years, Rosen forced Katkow to suck his penis during therapy. These sessions were invariably followed by vomiting on her part, which, he explained, was her throwing up her mother’s milk. Another patient of Rosen’s, Claudia Ehrman, who was treated by two of his assistants, was found dead in her room on 26 December 1979; she had been heavily beaten, it emerged, by the assistants as part of therapy, in ‘an attempt to force her to speak to them.’

An account of Dr. Rosen’s extraordinary theories and practices, known in the psychiatric profession since 1959 as ‘direct analysis’ – practices that culminated in the 102 charges against him being dropped in exchange for his licence – forms the central chapter of Jeffrey Masson’s book Against Therapy, published in 1988. Masson had himself trained as a psychoanalyst and was briefly projects director of the Sigmund Freud Archives, but he came to the conclusion that there was something very wrong with psychotherapy, whatever its genealogy. His was an attack on psychoanalysis from a direction not seen before – the claim that it was by definition corrupt and therefore irredeemably flawed.

Masson began his book by going back to Freud himself and reexamining the very first patient, Dora. Masson’s argument was that Freud had his own problems that he brought to the sessions with Dora, that they interfered with his interpretation of her condition, that she understood him every bit as well as he understood her, and that Freud ‘simply ignored her needs in the service of his own, which was to find more evidence for the correctness of his psychological theories.’31 In other words, psychoanalysis was flawed from the very beginning. From there, Masson moved forward, examining Sandor Ferenczi’s secret diary (not published until 1985, although he had died in 1933), which showed that he too had had his doubts about the therapeutic relationship, to the point of even considering a variant, namely ‘mutual analysis,’ in which the patient analyses the therapist at the same time as the therapist is analysing the patient. He also looked at Jung’s involvement with the Nazis, his anti-Semitism, and his mysticism, once again finding that Jung, like Freud, was an authoritarian, reading his own thoughts into whatever stories his patients told him, on the assumption that the therapist is healthy, devoid of neuroses, while the patient is in this sense unclean. Masson also looked at the newer therapies – those of Carl Rogers, for example, Fritz Perls’s Gestalt therapy, and the work of Rollo May, Abraham Maslow, and Milton Erickson.32 Everywhere he found a great deal of authoritarianism and, more perniciously, a great concern with sex, especially sex within the therapeutic relationship. For Masson, it was clear that with many therapists the therapeutic situation served their needs as much as, or more than, the needs of the so-called patients; for this reason he thought that therapy per se was impossible, and that this was why the figures showing the inefficacy of psychoanalysis had to be right.

Much wittier than Masson’s attack was Ernest Gellner’s in The Psychoanalytic Movement (1985), which must rank as one of the greatest intellectual putdowns of the century.33 Gellner, born in Paris in 1925 and educated in Prague and England, became professor of both philosophy and sociology at the London School of Economics, then William Wyse Professor of Social Anthropology at Cambridge. The subtitle of his book was ‘The Cunning of Unreason,’ and nothing in psychoanalysis – no non sequitur, no inconsistency, no piece of sloppy reasoning or logical laxity, no hypocrisy – was allowed to escape. His chief target is the unconscious, which he says is the new version of original sin.34 Its official principle, he says, in only one of many wonderful belittlings, is ‘Softlee Softlee Catchee Unconscious.’ It is as if, he says, there is an Unconscious Secrets Act; the unconscious is not merely hidden from consciousness but seeks actively to remain so.35 ‘Neither intelligence nor conscious honesty nor theoretical learning in any way increase the prospects of by-passing and surmounting the counter-intelligence ploys of the Unconscious.’36 By some strange set of events, however, Freud was able to break down this seemingly impregnable barrier and passed on the secret to others in a secular Apostolic Succession. But, asks Gellner, if the unconscious is so clever, why didn’t it see Freud coming and disguise itself even more thoroughly? Gellner’s aim was not simply to return to the statistical arguments against cure by psychoanalysis, however, but to debunk it. He quoted the Nobel Prize winner Friedrich von Hayek: ‘I believe men will look back on our age as an age of superstition, chiefly connected with the names of Karl Marx and Sigmund Freud.’37 Yet Gellner really had no need of help from others. ‘The Unconscious,’ he wrote, ‘is like some low hostelry just across the border, where all the thieves and smugglers indulge themselves with abandon, free of the need for camouflage and disguise which they prudently adopt, for fear of the authorities, when they are this side of the frontier … [The unconscious] is like meeting all one’s friends, enemies and acquaintances, but at the carnival and in fancy dress: one may be a bit surprised at what they get up to but there are few … surprises as to personnel.’38

Freud was not alone in being debunked. At the end of January 1983 the New York Times ran a front-page story headed: ‘NEW SAMOA BOOK CHALLENGES MARGARET MEAD’S CONCLUSIONS.’ The book was the work of the New Zealand-born Australian anthropologist Derek Freeman, who had been working in Samoa since 1940, mostly in an area some 130 miles from Ta’u, the village where Mead had done her fieldwork. His argument was that Mead had completely misunderstood Samoan society and, by implication, had drawn the wrong conclusions. The Samoans, said Freeman, were just as troublesome as people anywhere, and they had ‘resented the way they were portrayed in Coming of Age,’ as simple, playful people for whom sex was largely a game and whose nature was very different from that of people in other cultures.39

The New York Times story ran for 47 column inches, occupying almost a page inside the paper, and ignited a furious debate. Harvard University Press brought forward publication of Freeman’s book Margaret Mead and Samoa: The Making and Unmaking of an Anthropological Myth, and he was invited onto television programs all over America. Several scientific seminars were held to consider his findings, the most important of which was a meeting of the American Anthropological Association.40 Here Freeman’s motivation was called into question. It was noted that he had hitherto been an obscure academic, working in Samoa, by his own admission, since 1940. Could he not have presented his arguments before, while Mead was alive to defend herself? He replied that he had put his preliminary doubts to her, and she had acknowledged certain shortcomings in her data, but that it was not until 1981, when he was granted permission to examine Samoan court records, that he could conclude that Western Samoa was just as violent a society as any other.41 Other anthropologists doubted Freeman’s account at this point; they had had no problem getting access to court records many years before. A bigger issue, however, was where Freeman’s revelations, if revelations they were, left Franz Boas’s idea that culture, not nature, is the more important factor in determining behaviour patterns. Freeman was not himself a biological determinist, but there is no question that, if he was right, his revision of Mead’s findings provided support for a less ‘cultural’ understanding of human nature. The issue was never satisfactorily resolved, but Mead, like Freud, now has a definite shadow over her seminal work (no one doubts that many of her other findings were real).

In 1997 Roy Porter published The Greatest Benefit to Mankind: A Medical History of Humanity from Antiquity to the Present. In his chapter on clinical science, Porter quotes Sir David Weatherall, Regius Professor of Medicine at Oxford, who asked of modern medicine the question ‘How are we doing?’ and reached a surprisingly sombre conclusion. ‘We seem to have reached an impasse in our understanding of the major killers of Western society, particularly heart and vascular disease, cancer and the chronic illnesses whose victims fill our hospitals…. Although we have learned more and more about the minutiae of how these diseases make patients sick, we have made little headway in determining why they arise in the first place.’42

Weatherall’s scepticism is realistic; his argument is well made. Triumphalism in science is unscientific. The same goes for the revisions of Freud, Jung, and Mead. The irony – and absurdity – of having a therapeutic sensibility when the therapies don’t work cannot escape anyone. Porter’s own conclusion, after his masterly survey of medicine, was hardly less pessimistic than Weatherall’s: ‘The root of the trouble is structural. It is endemic to a system in which an expanding medical establishment, faced with a healthier population, is driven to medicalising normal events like menopause, converting risks into diseases, and treating trivial complaints with fancy procedures. Doctors and “consumers” are becoming locked within a fantasy that everyone has something wrong with them, everyone and everything can be cured.’43 This is one explanation, of course, for why the ‘cure rates’ for psychoanalysis are so dismal. Many who seek analysis have nothing wrong with them in the first place.