6 The Write Stuff
Healing through Language
Throughout their lives, people experience traumatic events that can have long-term negative consequences. The quintessential example is post-traumatic stress disorder (PTSD), in which people experience unwanted and distressing feelings and thoughts related to an earlier traumatic event. These feelings and thoughts can persist for years and, in some cases, can be debilitating. Another, less well-known consequence of trauma is complicated grief disorder (CGD), which most typically occurs as the result of bereavement. Although the death of a loved one will have a significant negative impact on most people, those who suffer from CGD experience high levels of distress long after their loss.
A fundamental assumption of psychotherapy is that talking about distressing experiences can lead to improvements in well-being. However, psychotherapy can be both expensive and time-consuming. Could simply writing about traumatic episodes be beneficial as well? The University of Texas psychologist James Pennebaker and his colleagues sought to answer this question. Some of the undergraduate participants in their study were asked to write about the most upsetting or traumatic event they had ever experienced and to express their deepest feelings and thoughts about the trauma. Participants in a control group were asked to simply write about an assigned topic, such as their plans for the rest of the day. Both groups wrote for just twenty minutes on four consecutive days. Perhaps not surprisingly, participants who had written about traumatic events, such as the loneliness they experienced at college or conflicts with romantic partners, reported higher levels of subjective distress immediately after these writing sessions. Remarkably, they reported being happier than the control group at a three-month follow-up. In addition, Pennebaker’s team found that students who had written about their trauma had fewer illnesses and visits to the university’s student health center in the weeks after the writing sessions. The researchers even observed physiological changes: blood tests revealed improvements in two measures of cellular immune function.1
In fact, additional research has linked a host of positive outcomes to writing or talking about traumatic experiences. Several other studies have also found reductions in doctor visits after writing about trauma. In addition, scientists have documented improvements in several physiological health markers besides cellular immune function. And several other behavioral changes, such as improvement in student grade point averages, finding a new position after a job loss, and reduction in work absenteeism, have been reported.2 Writing about emotionally traumatic experiences can even lead to improvements in people who suffer from rheumatoid arthritis and asthma.3
Of course, one could argue that undergraduate homesickness or adolescent relationship problems are only pale shadows of the kind of trauma that can lead to PTSD and CGD. However, Pennebaker also conducted interviews with Holocaust survivors living in the Dallas area in the 1980s. He found that the majority of these survivors had not talked about their horrific experiences with anyone else, either because they wanted to forget or because they thought no one else would understand. Pennebaker and his colleagues conducted one- to two-hour, highly emotional interviews with more than sixty survivors. The sessions were videotaped, and each interviewee’s heart rate and skin conductance were also measured. (Skin conductance is a measure of how much someone is perspiring. It is a commonly used measure of physiological arousal, which is often associated with strong affective states.) Based on these physical measures and the degree of traumatic content of the interviews, the researchers classified each survivor as being a low, midlevel, or high discloser. High disclosers remained physiologically relaxed as they described their suffering, whereas low disclosers displayed signs of tension during their interviews. When they contacted the participants about fourteen months after the interviews, the researchers found that both high and midlevel disclosers were healthier than they had been before participating in the interviews. The low disclosers, however, were more likely to have seen their doctors about an illness in the year after the interviews.4
Other research by Pennebaker and his colleagues suggests that specific types of words may be responsible for the beneficial effects of writing about traumatic experiences. Their work suggests that improved mental and physical health is seen when participants employ more positive emotion words (such as “happy,” “joyful,” and “elegant”) relative to negative emotion words (like “angry,” “wrong,” and “sad”). Words from other linguistic categories had positive but less consistent effects: words associated with insight (like “see,” “understand,” and “realize”) and terms associated with causation (like “infer,” “thus,” and “because”) were correlated with better physical, but not psychological, health.5
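To make this kind of word counting concrete, here is a toy Python sketch in the spirit of the dictionary-based approach Pennebaker's team pioneered. It is only an illustration: real analysis dictionaries contain thousands of entries, and the tiny category lists below are simply the example words quoted above.

```python
# Toy word-category tally, loosely in the spirit of dictionary-based
# text analysis. The category lists are just the examples from the text;
# real dictionaries are vastly larger.
CATEGORIES = {
    "positive_emotion": {"happy", "joyful", "elegant"},
    "negative_emotion": {"angry", "wrong", "sad"},
    "insight": {"see", "understand", "realize"},
    "causation": {"infer", "thus", "because"},
}

def category_rates(text: str) -> dict[str, float]:
    """Return each category's share of total words, as a percentage."""
    words = [w.strip(".,!?;:").lower() for w in text.split()]
    return {
        name: 100 * sum(w in vocab for w in words) / len(words)
        for name, vocab in CATEGORIES.items()
    }

sample = "I realize now that I was sad because I felt wrong."
print(category_rates(sample))
```

Run on the sample sentence, the tally registers one insight word, one causation word, and two negative emotion words; in the actual research, rates like these were computed across whole essays and then correlated with health outcomes.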
An important qualification to this line of research is necessary. Talking or writing about trauma proves most beneficial when people use language to create a narrative structure about their experiences. In other words, one must think about the traumatic event as a meaningful story, not just a pattern of unconnected experiences and sensations. When college students were asked to write about traumatic events in their lives as coherent narratives, they subsequently reported fewer bouts of illness than a group who described their trauma in the form of a fragmented list.6 Others have also echoed the importance of constructing a coherent self-narrative.7
It would be a mistake, however, to regard writing about traumatic events as some sort of magic cure-all. Pennebaker himself points out that writing about trauma seems to confer no benefit if it occurs right after the event itself. And writing a lot about negative events, perhaps via extensive journaling, may actually be harmful: it could turn into rumination, which might worsen anxiety or depressive symptoms.8
Reminiscing
Again, you can’t connect the dots looking forward; you can only connect them looking backward. So you have to trust that the dots will somehow connect in your future. You have to trust in something—your gut, destiny, life, karma, whatever.
—Steve Jobs (2005)
The founder of Apple famously spoke these words during a commencement address at Stanford.9 He was speaking to young people on the verge of a new life. But at the opposite end of life’s journey, it’s important to think about something Jobs didn’t say: in looking backward, the dots won’t connect themselves.

As we saw in the previous section, talking or writing about traumatic events has been shown to have positive health benefits. But there’s a catch—two actually: it’s best if some time has passed since the event, and the events need to be conveyed in the form of a coherent, meaningful narrative.
Erik Erikson speculated that one of the natural consequences of coming to terms with the approaching end of life is the desire to look back and make sense of it all.10 Erikson divided our life span into eight stages of development that begin at birth and continue through the end of life. He theorized that during each of these stages an individual is faced with a crisis. The theme of the crisis is tied to the most salient aspect of identity formation at that time. As we age, the successful resolution of one stage prepares us to cope with the upcoming crisis of the next. For Erikson, identity development is a lifelong process in which “different capacities use different opportunities to become full-grown components of the ever-new configuration that is the growing personality.”11
The first six crises, from infancy through one’s thirties, center on struggles related to trust, autonomy, initiative, competence, identity, and the experience of intimacy. Regardless of how these crises are resolved, adults in their forties, fifties, and early sixties next turn to figuring out how to give back to others and nurture future generations. Erikson conceptualized this struggle as one between generativity and stagnation. Although often expressed through child rearing and the guidance given by parents and grandparents, the goals of this stage can also be viewed more broadly. At this point, men and women start to think about their legacy and to reevaluate the direction their life is taking. Some people may recommit to previously established goals and find new strength and energy in seeing these goals through. Others may make a change and take their lives in a new direction.
For most people, it would be a dramatic overstatement to call this evaluation a midlife crisis. But certainly such a reexamination is likely to be taking place in middle age. A failure to use generativity as a way to model positive values sets an example of greed and selfishness for the next generation.12 And language is key to the realization of generativity. Mentoring, guiding, parenting, and teaching are just some of the ways people use language to give back to others.
After an individual has figured out how to be generative, he begins to think more about his own mortality. This new stage generally begins when a person retires, around age sixty-five. An individual who sees his life as whole and complete can make the most of the time he has left and face death with a sense of calm. A person who can only look back and feel that he hasn’t accomplished anything worthwhile becomes depressed and despairs of his eventual death. Based on Erikson’s ideas about this crisis between integrity and despair, Robert Butler created life review therapy13 to help adults reflect in a way that brings about reconciliation, peace, and a sense of life satisfaction14 and empowerment.15
Life review therapy is not the only technique for discovering the integrity of one’s life. For example, Guided Autobiography (GAB) is a structured life review process that occurs in a group setting.16 In GAB, people write down memories of events that relate to specific themes. They then share these memories with others in a small group. Through this kind of reminiscence, we can develop “a deeper understanding of who we are, where we have been, and where we are going in the future.”17
But not all searches of lost time are effective remembrances of things past. To separate adaptive kinds of reminiscence from maladaptive ones, Paul Wong and Lisa Watt created a taxonomy of reminiscence.18 Theirs is not the only way to specify types of reminiscence,19 but for our purposes, their categories illustrate some important ways in which the language used to describe the past has an impact on the present.
Of the six types of reminiscence that they discuss, the category called integrative reminiscence is the one most closely aligned with the goals of Erikson, life review therapy, and GAB. Integrative reminiscence involves coming to terms with failures and disappointments to understand the overarching trajectory of one’s life. “To the extent that one is able to resolve these negative sentiments from the past, life review should contribute to successful aging.”20 Integrative reminiscence also allows people to understand how they have lived their life according to their own system of values. It is empowering to see how one’s life has coherence, meaning, and purpose, even though things did not always work out as planned.
A second type of helpful reminiscence is instrumental reminiscence, which allows people to draw on past experiences to overcome present circumstances. Remembering a particularly difficult time and then thinking about what it took to overcome that difficulty can build resilience. Therefore, practicing instrumental reminiscence as a coping strategy is a skill worth developing much earlier than one’s sixties.
Unlike instrumental reminiscence, which has a practical goal, transmissive reminiscence occurs when an older person tells a younger person about her life and the lessons she learned from it. This kind of reminiscence is meant to instruct and guide. In Erikson’s terms, transmissive reminiscence is generative. Mentors often use this kind of reminiscence to help their mentees. A great deal of satisfaction can come from knowing that one’s past experiences will help others in the future.
Narrative reminiscence also occurs when someone tells stories about past events. But here there is no attempt to integrate the events, solve a problem, or instruct. Even so, this kind of reminiscing is not unhealthy. Children love to hear the stories of their parents and grandparents almost as much as those elders enjoy telling them. These kinds of stories can be entertaining or can allow people to bond over a shared history. Nevertheless, while not harmful, the mere narration of past events does not bring about a sense of closure or resolution.
When reminiscing takes the form of glorifying the past as a way of denigrating the present, however, it devolves into escapist reminiscing. There may be times when escaping into the past provides a sense of relief from the burdens of the present. But too much escape may keep a person from coping effectively (think Norma Desmond in Billy Wilder’s Sunset Boulevard).
Even worse, rumination on negative past events without any attempt to make sense of them could be seen as the definition of despair. This kind of obsessive reminiscence merely recycles the memories that keep a person feeling guilty and powerless. They are ultimately “destructive, if left unresolved.”21
Effective life review techniques teach individuals how to put past experiences into language that facilitates meaningful integration. They help people come to terms with unresolved issues and reconnect with the past. But life review is not necessarily the end of the story.
In the 1990s, Erikson’s wife Joan added one more stage to the theory after Erik had died. Called simply the Ninth Stage, this is a time when each of the crises of the previous eight stages is reexperienced all at once. People in their eighties and nineties may no longer trust their own abilities: they may feel a loss of the autonomy, sense of purpose, and competence it took a lifetime to develop. Experiencing such reversals could lead to a further loss of identity and a sense of isolation, uselessness, and despair. The crisis of the Ninth Stage, therefore, is existential in nature. Fittingly, successfully resolving this stage involves moving beyond the physical and into the realm of transcendence.22
Late Bloomers
Our united opinion is entirely against the book. It is very long, and rather old-fashioned.… Does it have to be about a whale?
—from a publisher’s rejection of Herman Melville’s Moby-Dick
Keep scribbling! Something will happen.
—attributed to Frank McCourt (1930–2009)
The literary world is replete with authors who achieved fame early in life by writing critically acclaimed debut novels. Bret Easton Ellis, for example, was just twenty-one when Less Than Zero was published in 1985, and F. Scott Fitzgerald was twenty-three when This Side of Paradise was released in 1920. Other novelists who made their mark at a young age include Truman Capote (Other Voices, Other Rooms: age 23), Zadie Smith (White Teeth: 24), and Norman Mailer (The Naked and the Dead: 25). The author as prodigy is not a modern phenomenon, either: a century before Fitzgerald, Mary Shelley was just twenty when Frankenstein appeared in 1818; and a century before that, the twenty-four-year-old Alexander Pope made a splash with The Rape of the Lock, which appeared in its first version in 1712.
Based on examples like these, one could argue that literary ability is simply something that a person is born with. Such a narrative is reinforced by wunderkinds like Stephen Crane, who was writing stories at fourteen and self-published Maggie: A Girl of the Streets at twenty-two. Perhaps even more impressive, just two years later, he wrote The Red Badge of Courage, in which he convincingly described a Civil War battle, despite never having served in the military. And despite being deaf and blind from infancy, Helen Keller wrote her autobiography, The Story of My Life, when she was twenty-two. She would go on to write eleven more books over her lifetime.
The accomplishments of young authors like these are impressive, but is such precocity typical? In 2010 the Humber School of Creative and Performing Arts in Toronto undertook a survey to determine the average age of authors when they published their first book. The survey excluded academicians and included only prose writers who had published their first work with a traditional publisher (as opposed to self- or online publishing). The surveys were distributed to 1,500 authors, and 475 responded. When the numbers were crunched, they showed that the average age of first-time published authors was forty-two.23 In other words, first-time authors who secured a book contract had probably been writing, either full- or part-time, for twenty years or more before they were published. This suggests that successful writers do not need to be kissed by the Muse at birth. It’s also in line with current research on the development of expertise, which suggests that success is not a matter of innate talent but rather the result of years of deliberate practice.24
In fact, we can cite many examples of writerly success at the other end of the age spectrum. The phrase “late bloomer” is often applied to such antiprodigies and reflects a cultural expectation that notable accomplishment is more common early in one’s career than later. According to the Oxford English Dictionary, the term was originally applied to plants, like the black-eyed Susan, that produce flowers relatively late in the summer or early fall. As early as 1921, the English psychologist Charles Spearman used the phrase metaphorically to refer to the intellectual abilities of children. And late blooming has always been the secret hope of plenty of parents with regard to seemingly directionless offspring.
Many authors, however, take to writing later in life, after pursuing a variety of other careers. And some only come into their own after producing a series of less successful works. Bram Stoker began writing as a theater critic and authored short stories and nonfiction while working as a civil servant in Dublin. In his forties, he published several unsuccessful novels, with titles like The Snake’s Pass (1890) and The Shoulder of Shasta (1895). Dracula, the gothic horror novel that made him famous, did not appear until 1897, the year he turned fifty. Anna Sewell wrote her first and only novel, Black Beauty, during her fifties, and it was published in 1877, just a few months before she died at the age of fifty-eight.
Other impressive examples of late-blooming authors include the British civil servant Richard Adams, who told his children a story about a group of talking rabbits during a car trip. His kids encouraged him to commit it to paper, and the result, Watership Down, was published in 1972, when Adams was fifty-two. The Irish American writer Frank McCourt had a career as a teacher before his wife encouraged him to write down the stories that he told about his impoverished childhood in Limerick. This led to Angela’s Ashes, published when McCourt was sixty-six. The memoir won a Pulitzer Prize for its author in 1997.
Many of the best-selling authors in history persevered despite receiving multiple rejections. Watership Down, which would become one of the fastest-selling books in history, was rejected by four publishers and three literary agencies before it was finally accepted. J. K. Rowling’s agent received twelve rejections from publishers unwilling to take a chance on a book about a boy wizard named Harry Potter. Both Stephen King and John Grisham had their debut novels rejected by dozens of publishers and literary agents. Zane Grey, the frustrated dentist who took up writing, was reportedly told at one point, “You have no business being a writer and should give up.” Another writer of Westerns, Louis L’Amour, received two hundred rejections before finding a publisher. Both Grey and L’Amour went on to write dozens of novels about the Old West, and both enjoyed sales of hundreds of millions of copies. Agatha Christie endured five years of rejections before she experienced her first success and eventually became the best-selling novelist of all time. But for sheer determination, it’s hard to top Jack London. His estate in California’s Sonoma Valley has preserved many of the six hundred rejections that came his way before he published his first story.25
Finally, some authors have been late blooming and also extremely prolific. A good example would be the British author Ted Allbeury (1917–2005). His life reads like a novel, or perhaps several of them. He was an intelligence officer and parachuted into Nazi Germany during World War II (he is thought to have been the only British secret agent to do so). During the Cold War, he ran agents across the border between East and West Germany and was once captured and tortured by the KGB. After coming in from the cold, he ran his own advertising agency and later became the managing director of a pirate radio station.26 So when he turned to writing, in his early fifties, he had plenty of material to draw on. Although he is not as well known as some of his contemporaries, like John le Carré or Jack Higgins, for many years Allbeury was one of Britain’s most popular espionage writers. His first novel, A Choice of Enemies, was published in 1972, and over the next twenty-one years, he penned forty-one novels, many under his own name, and others under two pseudonyms. At his most prolific, during the early 1980s, he published as many as four novels a year and continued to write until his early eighties. His final work appeared in 2000. It seems fair to say that Allbeury, had he begun writing in his early twenties, could not possibly have been so prolific within his chosen genre. Because he started writing relatively late in life, he was able to bring his wealth of experiences to his fictional creations. Being prolific is certainly an important part of achieving literary success. But it also helps if the author has something to say.
Writer’s Block
Writing is easy. All you do is stare at a blank sheet of paper until drops of blood form on your forehead.
—attributed to Gene Fowler (1890–1960)
As Fowler pointed out, many authors struggle mightily to put their thoughts into words. The inability to write easily or consistently is usually referred to as “writer’s block,” a term coined by the Austrian psychiatrist Edmund Bergler in 1947. Bergler’s Freudian interpretation that the problem was rooted in “oral masochism and a superego need for punishment” has not fared well in the marketplace of ideas.27 The phrase itself, however, had become firmly entrenched in the popular imagination by the 1970s. And any number of self-help books are available that purport to “conquer,” “cure,” or “break through” writer’s block.
The term may be well known, but it is also notably imprecise, as writers can get hung up at many different points in the writing process. In some cases, a blockage of words or ideas really does seem to occur. However, an inability to write can have many other causes, such as procrastination, perfectionism, fear of criticism, and garden-variety melancholy. In other cases, a writer may have too many ideas and find himself unable to choose among them. As a result, some researchers who study this condition employ the term “writing anxiety” instead. No one is immune, it seems: in a study of successful academic authors, the word most commonly employed to describe the writing process was “frustration.”28 And authors who claim to suffer from this malady can still be impressively prolific: even Fowler, quoted earlier, managed to produce dozens of novels, biographies, and Hollywood screenplays during his career.

Novelists who meet with initial success may be especially prone to becoming blocked. This is an example of a more general phenomenon, sometimes referred to as the “sophomore slump,” which can also affect initially promising college students (hence the name), as well as athletes and musicians. After a promising debut, many authors fear that their second creation will not measure up to the first, and this anxiety may lead to a loss of confidence and a delay in creating a follow-up work. In fact, such anxiety is warranted: probability theory predicts that an unusually strong performance (in this case, a much-lauded premiere) will tend to be followed by a less stellar showing, a phenomenon known as regression toward the mean.
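Regression toward the mean is easy to demonstrate with a short simulation, a sketch of my own devising in which each book’s reception reflects an author’s stable talent plus random luck:

```python
# Minimal simulation of regression toward the mean: the reception of a
# book is modeled as stable talent plus random luck. Authors whose
# debuts rank in the top decile were, on average, lucky as well as
# talented, so their second books tend to score lower.
import random

random.seed(1)
N = 100_000
books = []
for _ in range(N):
    talent = random.gauss(0, 1)           # stable component of quality
    debut = talent + random.gauss(0, 1)   # talent plus luck, book one
    second = talent + random.gauss(0, 1)  # same talent, fresh luck
    books.append((debut, second))

books.sort(key=lambda pair: pair[0], reverse=True)
top = books[: N // 10]                    # the much-lauded debuts
print(f"top-decile debut average:  {sum(d for d, _ in top) / len(top):.2f}")
print(f"their second-book average: {sum(s for _, s in top) / len(top):.2f}")
```

The second average comes out markedly lower than the first, not because the simulated authors lost their talent, but because the luck that helped propel a debut into the top decile does not usually strike twice.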
In some cases, the blockage can become permanent. The list of authors who have suffered this fate is both long and varied. Samuel Taylor Coleridge produced his best-known poems at the beginning of his career, and although he continued to write as a journalist and literary critic, he was never able to recapture the genius of his early years.29 More recent examples include Ralph Ellison and Harper Lee, who never published another novel after their first successful works (Invisible Man and To Kill a Mockingbird, respectively), although manuscripts by both authors were later published: Ellison’s Juneteenth appeared posthumously, and Lee’s Go Set a Watchman came out shortly before her death. During the final two decades of his life, Truman Capote was unable to produce another novel to follow In Cold Blood.30 And perhaps the most extreme example is Joseph Mitchell, the celebrated contributor to the New Yorker. He published Joe Gould’s Secret in 1965, when he was fifty-six. For the remaining thirty years of his life, he went to his office every day but never again produced anything of significance.31
It may be helpful to consider the solutions to blocks that professional authors have devised. After writing more than a dozen books, the British novelist Graham Greene became blocked in his fifties. He found that keeping a dream journal helped, since it was writing that was not meant to be read by anyone but him.32 After a fallow period, Greene went on to write another dozen books, including some of his best-known work. His last novel was published in his early eighties.
The American neurologist Alice Weaver Flaherty has suggested that we can better understand writer’s block by thinking of it as a brain state, and she contrasts it with a less well-known syndrome called hypergraphia, or the overwhelming urge to write. Flaherty suggests that both conditions can be triggered by underlying clinical syndromes, such as mania and epilepsy. With regard to writer’s block, she suggests that the ultimate culprit may be decreased activity in the frontal lobes of the brain, which may be offset by medications that are used to treat depression and anxiety.33 But cause and effect are notoriously difficult to disentangle here: writer’s block may be caused by anxiety and depression, yet the state of being blocked can itself lead to feelings of anxiety and depression.
We should note that not all writers believe in writer’s block. Authors such as Allan Gurganus and Mark Helprin have pointed out that such blockages do not occur in other professions. They disparage this idea by suggesting that it may simply be an excuse for sheer laziness. After all, plumbers and electricians don’t suddenly lose the ability to repair pipes and run wires.34
Gathering data on writer’s block turns out to be rather difficult. Many writers are reluctant to discuss their creative practices with researchers.35 And some fear that the simple admission of having writer’s block could turn a temporary slowdown into a more serious problem—a kind of self-fulfilling prophecy.36 As a result, studies of writer’s block have often involved college students: they represent a convenience sample for researchers, and they must frequently write against assignment deadlines.
Mike Rose, an American education researcher who was one of the first to study writer’s block in college students, suggests that students are often overly constrained by the inflexible rules they have been taught about how they “ought” to write. He also points to premature editing, problems in planning and writing strategies, and problematic attitudes and assumptions about the writing process itself. Although these conclusions were drawn from a small number of participants and were based on interview data, they provided an important starting point for later investigations.37
College students have also served as participants in studies designed to alleviate writing anxiety. In one study, a treatment group received a combination of cognitive-behavioral therapy aimed at reducing stress and instruction in how to write better. They were compared to a second group who received only the writing instruction. Both groups reported that their anxiety about writing decreased, but only the group that received both therapy and instruction showed an improvement in writing quality.38 The results of this study suggest that anxiety plays a major role in writer’s block.
Does the incidence of writer’s block increase or decrease over the adult life span? It’s difficult to say, since, as we have seen, the term can be used to refer to many different types of writing difficulties. A selection bias may also be involved: individuals who are frequently blocked may, over time, gravitate to professions that do not require extensive amounts of writing. Finally, older writers may confront changes in motivation and energy levels. An author may retire from writing because she is blocked, but the reason may also be that she simply lacks the stamina for the sustained cognitive effort that writing requires. When Philip Roth was asked if he ever missed writing, he replied, “I was by this time [about age 77] no longer in possession of the mental vitality or the verbal energy or the physical fitness needed to mount and sustain a large creative attack of any duration on a complex structure as demanding as a novel.… Not everyone can be fruitful forever.”39
The Destroyer of Minds
Consider the following passage from a late-twentieth-century novel:
Benet had taken a taxi from his house to Anna’s house near Sloane Square. From here he had taken a taxi to Owen’s house. Now he was taking another taxi to Rosalind’s little flat off Victoria Street. He rarely drove his car in London. As he sat in the taxi he felt a pang of painful miserable guilt.
Although this excerpt is just fifty-eight words and five sentences long, it is full of repetition. The phrase “taken a taxi” appears twice in the first two sentences, along with “taking another taxi” in the third, and “in the taxi” in the fifth. The word “house” is used three times in the first two sentences. An English teacher would undoubtedly be itching to take a red pen to such writing to remove the repetitions, or at least to suggest alternate word choices.
It might come as a surprise, therefore, to learn that the author of this passage is Iris Murdoch, the acclaimed British novelist and philosopher (1919–1999). She published twenty-six novels over a forty-year career that began in the mid-1950s, and the passage quoted here is from her final fictional work, Jackson’s Dilemma.40 Earlier in her career, Murdoch’s work had won awards, including the coveted Booker Prize for Fiction, and she was made a Dame Commander of the British Empire in 1987. Murdoch was, in fact, considered one of the most significant British writers of the postwar period.
When Jackson’s Dilemma was published, however, critical opinion was decidedly mixed. The New York Times, for example, praised the work for being “psychologically rich” but also noted that it was “strewn with imprecisions and blatant redundancies,” with “pet words [scattered] like so many nails in the reader’s road.” In short, the reviewer declared, “the writing is a mess.”41
What the reviewers did not know was that Murdoch was already struggling with cognitive impairment and would be diagnosed with Alzheimer’s disease (AD) in 1997. She died two years later at the age of seventy-nine.42 Her battle with the disease is well known because of her husband’s memoir and the 2001 movie Iris, in which she was played by Kate Winslet and Judi Dench.43
As mentioned previously, a diagnosis of AD cannot be made definitively during one’s lifetime, and in Murdoch’s case, it was confirmed only after her death. Peter Garrard, the neuroscientist who made the initial diagnosis and conducted her autopsy, went on to coauthor a paper that examined her literary output. The researchers compared samples of the prose in Jackson’s Dilemma with samples from her first novel (Under the Net, 1954), as well as the midcareer work that won the Booker Prize (The Sea, the Sea, 1978). Although the syntax used in the three books did not vary greatly, her final novel uses a smaller vocabulary, and the words she employs are more common.44
Building on Garrard’s work, a group of researchers at the University of Toronto conducted a computerized analysis of the complete texts of all of Murdoch’s novels and examined a wider variety of linguistic markers. They also found an abrupt decline in the breadth of vocabulary that Murdoch employed in Jackson’s Dilemma when compared to her other novels. In addition, they found increases in phrasal and word repetitions (as seen with “taxi” and “house” in the passage quoted earlier), as well as an increased use of lexical fillers, such as “um” and “ah.” Significantly, some of these changes in her writing had begun to occur when she was in her fifties, more than twenty years before she died.45
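To get a rough feel for what such markers measure, here is a minimal Python sketch of my own, run on the passage from Jackson’s Dilemma quoted earlier. It uses a simple type-token ratio as a stand-in for vocabulary breadth and tallies the most frequently repeated words; the published analyses used far more sophisticated measures, so this is illustrative only.

```python
# Two crude lexical markers computed over the quoted Murdoch passage:
# type-token ratio (distinct words / total words) and word repetitions.
import re
from collections import Counter

PASSAGE = (
    "Benet had taken a taxi from his house to Anna's house near Sloane "
    "Square. From here he had taken a taxi to Owen's house. Now he was "
    "taking another taxi to Rosalind's little flat off Victoria Street. "
    "He rarely drove his car in London. As he sat in the taxi he felt a "
    "pang of painful miserable guilt."
)

tokens = re.findall(r"[a-z']+", PASSAGE.lower())
ttr = len(set(tokens)) / len(tokens)  # higher = broader vocabulary
print(f"{len(tokens)} words, type-token ratio {ttr:.2f}")

# The most repeated words include "taxi" (4 times) and "house" (3).
for word, count in Counter(tokens).most_common(5):
    print(f"{word!r} appears {count} times")
```

Researchers tracking an author across decades compute measures like these for each book (with corrections for length, since type-token ratios naturally fall as texts grow longer) and then look for abrupt departures from the author’s own baseline.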
Of course, it’s unwise to make generalizations from a single case, and the Toronto researchers also analyzed the works of Agatha Christie (1890–1976), the enormously successful British crime novelist, as well as P. D. James (1920–2014), another British crime writer. The researchers chose Christie because she was suspected of having developed dementia during her final years, although no postmortem confirmation was undertaken. James was included as a control, since she showed no signs of cognitive impairment during her lifetime. Just as with Murdoch, the researchers found that Christie’s vocabulary size suddenly declined in her later novels, and she also greatly increased her use of word repetitions and fillers. P. D. James, on the other hand, displayed none of these trends: none of the lexical or syntactic fluctuations during her career were statistically significant.46
The availability of machine-readable texts and the development of sophisticated linguistic analysis tools have led to something of a cottage industry in this field. Ian Lancashire, part of the Toronto group of researchers, has analyzed the works of Murdoch as well as two other writers who are known to have suffered from dementia: Ross Macdonald (1915–1983), a crime fiction novelist; and Enid Blyton (1897–1968), the incredibly prolific English writer who penned hundreds of children’s books. Lancashire compared these authors to L. Frank Baum, James Hilton, and R. A. Freeman, none of whom showed cognitive impairment during their lives. The nonimpaired authors were similar in that their vocabulary sizes either remained the same or increased during their careers, and their rates of employing phrasal repetitions remained unchanged. Macdonald and Blyton, however, showed the now familiar pattern of increased repetitions over time and, in the case of Macdonald, a shrinking vocabulary size.47
Sadly, many writers become painfully aware that something is very wrong with their mental faculties. Murdoch, for example, talked of feeling like she was falling and being in “a very, very bad, quiet place” during an interview in the year after Jackson’s Dilemma was published.48 And Terry Pratchett (1948–2015), the popular English fantasy writer, made his diagnosis—a type of early onset AD—public in 2007, when he was fifty-nine. He spent the remaining years of his life advocating for more funding for medical research and was able to complete several more novels “through the haze of Alzheimer’s.”49 However, he was unable to complete his autobiography.
Several prominent politicians, including Ronald Reagan and Margaret Thatcher, were diagnosed with dementia, a fact that raises questions about whether their cognitive impairments began while they were in office. Most politicians are not novelists, but they do engage in linguistic performances at events like press conferences. Unlike speeches, which can be ghostwritten, encounters with the press are unscripted and provide a means for assessing the language abilities of political leaders. Visar Berisha and his colleagues analyzed transcripts of press conferences given by President Reagan, who was diagnosed with AD in 1994, and President George H. W. Bush, who remained cognitively healthy as he aged. As might be expected by now, the researchers found a reduction in Reagan’s vocabulary size over time, as well as an increase in his use of lexical fillers (well, uh) and less-specific words (we, they, something, anything). The researchers found no similar trends in Bush’s press conference remarks.50
Even historical figures who have been dead for centuries are having their written words scrutinized for signs of cognitive impairment. King James VI/I (1566–1625) has received this treatment as well. During his long reign, James united the Scottish and English crowns and sponsored the translation of the Bible that bears his name. He also left behind fifty-seven letters that cover a twenty-year period of his life. These letters show a pattern of decreasing grammatical complexity over time, but also an increase in vocabulary size. As we have seen, such changes are not unusual in themselves, but because the king also suffered from circulatory problems, the authors of the study speculate that he may have had vascular dementia.51
Most writers never develop dementia. But its debilitating effects are echoed in Martin Amis’s observation that “writers die twice: once when the body dies, and once when the talent dies.”52
Lessons from the Nuns
In the early 1930s, two young women in Mankato, Minnesota, were asked to write short narratives about their lives. They were novices in the School Sisters of Notre Dame, a Roman Catholic order, and their mother superior had decreed that all novices write a brief autobiographical sketch before taking their vows.
The approaches taken by the two novices were very different. Consider the first sentence in the narrative written by “Sister Helen” (a pseudonym):
I was born in Eau Claire, Wis., on May 24, 1913, and was baptized in St. James Church.53
And now “Sister Emma”:
It was about a half hour before midnight between February twenty-eighth and twenty-ninth of the leap year nineteen-hundred-twelve when I began to live and to die as the third child of my mother, whose maiden name is Hilda Hoffman, and my father, Otto Schmitt.54
These sketches, along with many others, were filed away in the convent’s archives in Mankato. They would remain there undisturbed for decades, but when rediscovered, they would have an enormous impact on the field of cognitive aging.
In 1986, David Snowdon, an epidemiologist then at the University of Minnesota, contacted the religious order in search of volunteers for an aging study. Because of their communal life in convents, eating the same food and working in the same profession, these teaching nuns constituted an ideal group for the study of aging. The nuns who agreed to take part completed a battery of cognitive tests each year. And crucially, they also consented to having their brains autopsied after they died. Because of this, it was possible to definitively assess whether any of the sisters had developed Alzheimer’s disease. The project expanded to include six other Notre Dame convents, and ultimately 678 members of the order joined the study.
Then one day, quite by accident, Snowdon discovered the room that contained the archives of the Mankato convent. Within the archives, in a couple of olive green metal file cabinets, he found the records of the nuns who had taken their vows at Mankato. He unearthed a wealth of information, including the narrative sketches of Sisters Helen and Emma, as well as dozens of others. Suddenly, instead of being limited to studying the sisters from the beginning of his research, he now had a time machine; he could journey decades into their pasts. Specifically, he could look for clues in the ways the novice nuns had written about themselves more than a half-century earlier, and correlate the writing styles with the mental status of the nuns in the present.55
To analyze the autobiographies, Snowdon teamed up with James Mortimer and Susan Kemper. They initially focused on two linguistic measures: idea density and grammatical complexity. Idea density was operationalized as the number of ideas expressed per ten words. In the earlier examples, Sister Helen’s simple declarative sentence received a low idea-density score, whereas Sister Emma’s intricate opening sentence scored far higher. Likewise, Sister Helen’s grammatical complexity was coded in the lowest category, while Sister Emma’s received the highest possible score.
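Once the propositions in a sentence have been counted, the calculation itself is simple arithmetic, as the sketch below shows. The proposition count here is my own rough hand analysis of Sister Helen’s sentence, offered for illustration; the actual studies scored propositions according to detailed linguistic rules.

```python
def idea_density(num_propositions: int, num_words: int) -> float:
    """Idea density: elementary propositions expressed per ten words."""
    return 10 * num_propositions / num_words

# Sister Helen's sentence contains 18 words. A rough hand analysis
# yields about 7 elementary propositions: born / in Eau Claire / in
# Wisconsin / on May 24 / in 1913 / baptized / in St. James Church.
print(round(idea_density(7, 18), 1))  # -> 3.9 propositions per 10 words
```

A longer, denser sentence such as Sister Emma’s packs many more propositions into every ten words, which is what earns it a higher score.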
When Snowdon’s group published their first paper examining language and aging, fourteen of the nuns had passed away, and AD was confirmed in the brains of seven. All seven, the researchers found, had written autobiographies with low idea density. None of the nuns whose essays contained high idea density had AD. And low grammatical complexity was also associated with AD, although not as strongly.56 By 2000, seventy-four of the nuns who had written essays had passed away, and once again, idea density in the essays was a strong predictor of a diagnosis of AD postmortem.57 A later study found an association between these linguistic factors and other declines in cognitive function, including mild cognitive impairment in old age.58
The essays also yielded another surprising finding: the expression of positive emotions was associated with longer life. Deborah Danner, a psychologist involved in the Nun Study, led a group that analyzed 180 of the nuns’ essays, identifying words associated with positive emotions (happy, love, hope), as well as those that denoted negative emotions (sad, afraid, anxious). The researchers found a seven-year difference in life expectancy between the nuns who used the largest number of positive emotion words and those who used the fewest.59
This pioneering work showing an association between linguistic factors and dementia has inspired others to undertake similar projects. A team at Utah State University has studied personal journals and letters that are part of the Cache County Memory Study. Intriguingly, the mere fact of keeping a journal was associated with a 53 percent reduction in the risk of all forms of dementia. As for the journals’ content, the only linguistic variable associated with dementia was the percentage of long words (six letters or more): greater use of long words was associated with a reduced risk of being diagnosed with AD.60
And what of the two nuns whose autobiographies were mentioned earlier? When they were assessed by Snowdon’s team in 1992 with a standard cognitive test, Sister Helen earned the lowest possible score: zero out of thirty points. She died a year later at age eighty, and a postmortem examination of her brain confirmed that she had AD. Sister Emma, who was fifteen months older, had received a perfect score on her test and, at the time of Sister Helen’s death, was still mentally sharp.61 In fact, some of the sisters enrolled in the study would maintain high levels of cognitive functioning into their nineties and beyond, even though autopsies of their brains detected the plaques and tangles typical of AD.62 Such findings demonstrate the protective effects of the cognitive reserve that we discussed in the first chapter. By 2016, only 8 of the 678 women enrolled in the Nun Study were still alive, with the youngest being one hundred.63 It truly is remarkable that these women who dedicated their lives to teaching continue to instruct and inspire long after their deaths.
Fiction Is Stronger Than Truth
Fiction is the lie through which we tell the truth.
—attributed to Albert Camus (1913–1960)
Good fiction’s job [is] to comfort the disturbed and disturb the comfortable.
—David Foster Wallace (1962–2008), in an interview with Larry McCaffery
No one would claim that reading isn’t good for you; after all, it’s one of the best ways that people have for finding out about the world. The amount of information that we acquire through the written word is simply enormous. Not only that, but certain types of reading might confer psychological benefits that go far beyond a knowledge of history or the news of the day. Specifically, reading fiction might make us better human beings.
At first, this claim may seem far-fetched. How could novels or short stories tell us greater truths than nonfiction descriptions of events or ideas? What makes fiction special is that it exposes us to the psychological depths that exist in everyone. By reading about fictional characters, we develop a better understanding of the perspectives of other people, which in turn makes readers of fiction more socially perceptive.64 It’s been argued that reading fiction increases the empathy that we feel toward other people. In addition, reading fiction may also enhance what psychologists call one’s theory of mind (ToM). This refers to a person’s ability to understand the intentions, beliefs, and desires of other people. By mentally engaging with the complex motivations and actions of characters in fictional works, we may hone our abilities to see the world from points of view that differ from our own.
To test the supposed benefits of reading fiction, researchers have had to find ways to quantify these constructs. There are several tests for measuring empathy and the components of ToM, but determining how much fiction someone has read can be tricky. Asking people directly about their reading habits is problematic, because a strong component of social desirability may influence the answer: everyone wants to be perceived as being well read. So psychologists who study reading have created an “exposure to print” measure, which asks participants to pick out the names of fiction and nonfiction authors from a list that also contains the names of nonwriters.65 Several studies have found positive associations between measures of empathy, ToM, and exposure to print. A meta-analysis of these studies by the psychologists Micah Mumper and Richard Gerrig has confirmed this finding, although the effect was modest.66 These findings are also in line with research by Zazie Todd, who recruited participants to read novels and discuss them as a group. In the discussions, a common theme was expressions of sympathy and empathy for the fictional characters.67
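The scoring logic behind such a checklist is straightforward: participants get credit for recognizing real authors, and checking a made-up name counts against them, which discourages indiscriminate guessing. Here is a toy Python sketch of that rule; the lists are tiny and the foil names are invented, so this is an illustration rather than any published instrument.

```python
# Toy "exposure to print" checklist scorer. Real instruments use much
# longer lists of authors and plausible-sounding foils.
REAL_AUTHORS = {"Iris Murdoch", "Agatha Christie", "Frank McCourt"}
FOILS = {"Martin Fellowes", "Carol Denham"}  # invented nonwriter names

def exposure_score(checked_names: set[str]) -> int:
    hits = len(checked_names & REAL_AUTHORS)   # real authors recognized
    false_alarms = len(checked_names & FOILS)  # foils wrongly checked
    return hits - false_alarms                 # guessing is penalized

print(exposure_score({"Iris Murdoch", "Frank McCourt", "Carol Denham"}))  # 1
```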
If cumulative exposure to the written word has measurable effects on more global cognitive and social processes, then we might expect to see other specific advantages for reading, and we do. Changes in the connectivity of the brain have been observed as research participants read a work of historical fiction over a period of three weeks. These beneficial changes were observed in the short term, during the study itself, and also in the days after the reading concluded.68
The psychologist Mei-Ching Lien and her colleagues compared the word recognition abilities of college students and a group of older adults in their sixties and seventies. The researchers asked the participants to identify words while simultaneously responding to competing visual and auditory tasks. Doing two things at once normally requires additional cognitive resources, which can be problematic for older adults. In this study, however, the older participants outperformed their younger counterparts on these competing tasks. The researchers suggest that a lifetime of exposure to print may lower the resources required to access words in memory—a facility that the college students have not yet acquired.69 Other researchers have found additional advantages of exposure to print for older adults, such as reducing the limitations of working memory while reading. These are important findings, because they suggest that being an inveterate reader compensates for declines in other cognitive processes.70 It is clear, therefore, that reading fiction can benefit adults in a wide variety of ways.
Amazingly, reading fiction has also been associated with a longer life. A large-scale study conducted by researchers at the Yale University School of Public Health found that book readers had a twenty-three-month “survival advantage” compared to non–book readers. Put another way, the study found a 20 percent lower mortality rate over a twelve-year period for book readers. The study was, of necessity, correlational; after all, researchers cannot randomly assign people to book-reading and non-book-reading conditions for years at a time. Therefore we cannot distinguish between the competing hypotheses that more reading leads to a longer life and that a longer life allows for more reading. In addition, many factors are undoubtedly involved in this association, although the beneficial effect of reading persisted even after the researchers controlled for a host of potential confounds, such as age, gender, education, marital status, health, and affluence. The beneficial effect was driven primarily by the reading of books, as opposed to magazines and newspapers. And since most book readers read fiction, this ties into the other advantages mentioned earlier. In fact, the study showed that as little as thirty minutes a day spent reading books was beneficial. The authors of the study suggest that the greater cognitive engagement required for reading books—as opposed to periodicals—accounts for this longevity effect.71