2
FROM WOMB TO TOMB
Myths about Development and Aging
Myth #6 Playing Mozart’s Music to Infants Boosts Their Intelligence
Few qualities—or quantities—are more prized in American society than intelligence and intellectual accomplishment. When it comes to academic achievement, parents love to win bragging rights on their children’s behalf. Just look at car bumper stickers: “My Child is an Honor Student at East Cantaloupe High School,” “Proud Parent of an Honor Student at North Igloo Elementary,” or for laughs, “My French Poodle is Smarter than Your Honor Student.” In today’s cutthroat world, many parents are understandably eager to lend their children a competitive advantage over their classmates. This undeniable fact raises an intriguing question: Could parents give their children a jump-start by stimulating them intellectually in infancy, perhaps only a few months, weeks, or even days after birth?
This may sound like the stuff of a futuristic science fiction novel. Yet it seemingly turned into reality in 1993 with the publication of an article in one of the world’s premier science journals, Nature. In that paper, three University of California at Irvine researchers reported that college students who listened to a mere 10 minutes of a Mozart piano sonata displayed a significant improvement on a spatial reasoning task—a test involving paper folding and cutting—compared with a group of students who listened to either a relaxation tape or to silence (Rauscher, Shaw, & Ky, 1993). The overall improvement translated into a boost of about 8 or 9 IQ points. The Mozart Effect—a term coined by physician Alfred Tomatis (1991) and later popularized by educator and musician Don Campbell (1997) to refer to the supposed enhancement in intelligence after listening to classical music—was born.
The 1993 finding didn’t imply anything about the long-term enhancement of spatial ability, let alone intelligence in general. It applied only to one task administered almost immediately after listening to Mozart’s music. Nor did the finding imply anything about the effects of Mozart’s music on infants, as the original study examined only college students.
But this didn’t stop the popular press or toy companies from picking up the Mozart Effect ball and running with it. Based entirely on speculation that the original findings might apply to infants, companies soon began to market scores of Mozart Effect CDs, cassettes, and toys targeted toward babies. By 2003, Don Campbell’s popular Mozart Effect CDs had sold over 2 million copies (Nelson, 2003). As of 2008, Amazon.com featured over 40 products, mostly CDs and cassettes, on the Mozart Effect, many of which proudly feature young children or newborn infants on their covers.
In addition to the mass marketing of scores of Mozart Effect products to receptive parents, another reason for this effect’s popularity may stem from a confusion between correlation and causation (see Introduction, p. 13). Studies show that musical talent tends to be positively associated with IQ (Lynn, Wilson, & Gault, 1989). Some people may erroneously leap from this correlational finding to the conclusion that exposure to music increases IQ.
As psychologists Adrian Bangerter and Chip Heath (2004) observed, the Mozart Effect claim spread through society much like a message passes through a game of telephone, becoming increasingly distorted and often exaggerated over time. One 2000 article in a Chinese newspaper claimed that “According to studies conducted in the West,” babies who listen to Mozart masterpieces “during gestation are likely to come out of the womb smarter than their peers” (South China Morning Post, 2000, as cited in Bangerter & Heath, 2004). Yet no published studies conducted in the West or elsewhere had ever examined the effects of Mozart’s music on humans in utero. A 2001 article in the Milwaukee Journal Sentinel referred to “numerous studies on the Mozart effect and how it helps elementary school students, high school students, and even infants increase their mental performance,” despite the fact that no researchers had investigated the effects of Mozart’s music on any of these groups (Krakovsky, 2005).
These widespread media reports appear to have had an effect on public perception; two surveys revealed that over 80% of Americans were familiar with the Mozart Effect (Bangerter & Heath, 2004). A survey of introductory psychology students revealed that 73% believed that “listening to Mozart will enhance your intelligence” (Taylor & Kowalski, 2003, p. 5). Several years ago, the coach of the New York Jets football team arranged for Mozart’s music to be played through loudspeakers during practice sessions in an effort to enhance their performance. A New York community college even set aside a Mozart Effect study room for its students.
The Mozart Effect eventually reached the hallowed halls of state legislatures. In 1998, then Georgia Governor Zell Miller added $105,000 to the state budget to allow each newborn child in Georgia to receive a Mozart CD or cassette free of charge, announcing his bold initiative over the inspiring strains of Beethoven’s Ninth Symphony (Mercer, 2010; Sack, 1998). According to Miller, “No one questions that listening to music at a very early age affects the spatial-temporal reasoning that underlies math and engineering and even chess.” Tennessee governor Don Sundquist soon followed suit, and the Florida State Senate likewise passed a bill requiring day care centers that received state funding to play classical music to infants on a daily basis (State of Florida Senate Bill 660, May 21, 1998).
But all of this implies that the Mozart Effect is real. Is it?
Several investigators who tried to replicate the original Nature findings reported either no effect or a minuscule one (Gray & Della Sala, 2007; McKelvie & Low, 2002). Analyses that combined the results across multiple studies revealed that the Mozart Effect was trivial in magnitude—2 IQ points or less—and of trivial duration, typically an hour or less (Chabris, 1999; Steele, Bass, & Crook, 1999). Some researchers began to claim that the Mozart Effect materialized only with certain pieces of Mozart’s music and not others, but other researchers never confirmed these assertions. Moreover, none of the published studies examined children, let alone infants, who were the supposed beneficiaries of the Mozart Effect. Georgia governor Zell Miller (1999) urged advocates of the Mozart Effect to ignore these negative findings, reassuring them not “to be misled or discouraged by some academics debunking other academics.” But this is precisely how science works at its best: by refuting, correcting, or revising claims that haven’t stood up to careful scrutiny.
Later researchers helped to pin down the source of the Mozart Effect. In one study, they asked students to listen to an uplifting piece by Mozart, a depressing piece by another classical composer (Albinoni), and silence (Thompson, Schellenberg, & Husain, 2001). Immediately afterwards, the investigators gave participants a paper folding and cutting task. The Mozart piece improved performance on this task relative to the two control conditions, but it also enhanced emotional arousal relative to these conditions. When the researchers used statistical techniques to control for differences in emotional arousal across the three experimental conditions, the Mozart Effect vanished. The results of another study demonstrated that listening to Mozart was no better for improving spatial ability than listening to a passage from a scary story by horror writer Stephen King (Nantais & Schellenberg, 1999).
These findings suggest an alternative explanation for the Mozart Effect: short-term arousal. Anything that heightens alertness is likely to increase performance on mentally demanding tasks (Jones, West, & Estell, 2006; Steele, 2000), but it’s unlikely to produce long-term effects on spatial ability or, for that matter, overall intelligence. So listening to Mozart’s music may not be needed to boost our performance; drinking a glass of lemonade or cup of coffee may do the trick.
The bottom line: The Mozart Effect may be “real” in the sense that it enhances immediate performance on certain mental tasks. But there’s no evidence that this has anything to do with Mozart’s music, or even music at all (Gray & Della Sala, 2007). Nor is there evidence that it increases intelligence in adults, let alone infants. Of course, introducing children to the music of Mozart and other great composers is a wonderful idea, not only because such music can be uplifting, but because it’s had an immense influence on Western culture. But parents hoping to transform their babies into geniuses by exposing them to the soundtrack of Amadeus are best advised to save their money.
The popular craze following in the wake of the Mozart Effect wasn’t the first time that entrepreneurs capitalized on eager parents’ desires to boost their infants’ intellects. Many of these marketers seized on widespread, but poorly supported, claims that the first three years of life are especially crucial in infants’ intellectual development (Bruer, 1997; Paris, 2000). In the 1980s, thousands of parents introduced their newborn infants to hours of foreign languages and advanced mathematics in a concerted effort to create “superbabies” (Clarke-Stewart, 1998). But no superbabies emerged. Today, such alleged intelligence-improving products as “Baby Einstein” toys and videos are a $100 million a year industry (Minow, 2005; Quart, 2006). Yet there’s no good evidence that these products work either. To the contrary, research suggests that babies learn less from videos than from playing actively for the same time period (Anderson & Pempek, 2005).
The work of the great Russian developmental psychologist Lev Vygotsky may help to explain why these products are doomed to fail. As Vygotsky (1978) observed, learning occurs best within a “zone of proximal development,” in which children can’t yet master a skill on their own but can do so with help from others. If 3-year-old children don’t possess the cognitive skills to learn calculus, no amount of exposure to calculus will increase their math abilities, let alone transform them into superbabies, because calculus lies outside their zone of proximal development. Much as impatient parents might want to hear otherwise, children can’t learn until their minds are ready.
Myth #7 Adolescence Is Inevitably a Time of Psychological Turmoil
In a recent weekly newspaper advice piece, an exasperated mother wrote to ask the columnist, Hap LeCrone (2007), to explain what had happened to her now 11-year-old daughter, who was until recently an easy-going and happy child. “If we like something, she hates it,” the mother wrote. Her daughter “doesn’t want to accompany us anywhere,” and “her responses to us are not often very civil.” What’s more, “getting her to keep her room straight or dress nicely is like pulling teeth,” and “back talk is the norm.” What on earth, the mother wondered, is going on? LeCrone responded succinctly: “Some parents call what you are going through the disease of adolescence.”
The view that adolescence is always or almost always a time of emotional turmoil is hardly new. Psychologist G. Stanley Hall (1904), the first president of the American Psychological Association, was also the first to refer to adolescence as a time of “storm and stress.” Hall borrowed this term from the 18th century German “Sturm und Drang” movement in music, art, and literature, which emphasized the expression of passionate and often painful emotions. Later, Anna Freud (1958), daughter of Sigmund Freud and a prominent psychoanalyst in her own right, popularized the view that adolescent emotional upheaval is pervasive (Doctors, 2000). She wrote that “to be normal during the adolescent period is by itself abnormal” (A. Freud, 1958, p. 267) and that “adolescence is by its nature an interruption of peaceful growth” (p. 275). For Anna Freud, the teenager who experiences minimal distress is actually pathological, and is at greatly heightened risk for psychological problems in adulthood.
Today’s pop psychologists have fueled the perception that the teenage years are usually times of high family drama. For example, the promotional copy for parenting expert Dr. James Dobson’s (2005) book,
Preparing for Adolescence, informs readers that it will “help teens through the rough years of adolescence” and help “parents who want to know what to say to a child who’s getting ready to enter those turbulent teenage years.” A television show on adolescence featuring “Dr. Phil” (Phil McGraw) warned viewers that “the teenage years can be a parent’s worst nightmare” and promised to discuss “ways for parents and teens to survive adolescence.”
The stereotype of the “terrible teen” years is echoed in much of the entertainment media. Dozens of films, including Rebel Without a Cause (1955), Ordinary People (1980), Kids (1995), Girl, Interrupted (1999), and Thirteen (2003), focus on the plight of troubled adolescents, and the title of a 2002 British television series, Adolescence: The Stormy Decade, speaks for itself. In addition, such bestselling novels as J. D. Salinger’s The Catcher in the Rye (1951) capture the pain and confusion of the teenage years.
Because books and movies focus far more often on tales of troubled rather than healthy adolescents—a Hollywood film about an entirely normal teenager is unlikely to make for an interesting storyline, let alone hefty box office receipts—the public is routinely exposed to a biased sampling of teenagers (Holmbeck & Hill, 1988; Offer, Ostrov, & Howard, 1981). Perhaps not surprisingly, many laypersons believe that adolescence is usually a time of storm and stress. As psychologist Albert Bandura (1964) noted, “If you were to walk up to the average man on the street, grab him by the arm and utter the word ‘adolescence,’ it is highly probable … that his associations of this term will include references to storm and stress, tension, rebellion, dependency conflicts, peer-group conformity, black leather jackets, and the like” (p. 224).
Bandura’s informal observations are borne out by surveys of college students. Grayson Holmbeck and John Hill (1988) found that students enrolled in an undergraduate course on adolescence scored an average of 5.2 (out of 7) on the item “Adolescence is a stormy and stressful time.” Parents and teachers hold similar views (Hines & Paulson, 2006). This position is widespread even among health professionals. One survey of staff in a pediatric hospital revealed that 62% of medical residents (doctors in training) and 58% of nurses agreed that “the majority of adolescents show neurotic or antisocial behavior sometime during adolescence.” In addition, 54% of medical residents and 75% of nurses agreed that “Doctors and nurses should be concerned about the adjustment of the adolescent who causes no trouble and feels no disturbances,” mirroring Anna Freud’s position that the “normal” adolescent is actually abnormal (Lavigne, 1977).
To evaluate claims regarding adolescent storm and stress, we need to examine three domains of teen behavior: (1) conflicts with parents, (2) mood instability, and (3) risky behavior (Arnett, 1999). Research shows that like several other myths in this book, the adolescent storm and stress claim possesses a kernel of truth, which probably accounts in part for its popularity. At least in American society, adolescents are indeed at somewhat elevated risk for difficulties across all three domains (Arnett, 1999; Epstein, 2007). Conflicts with parents escalate during the teen years (Laursen, Coy, & Collins, 1998), teens report more mood changes and more extreme moods than do non-teens (Buchanan, Eccles, & Becker, 1992; Larson & Richards, 1994), and teens take more physical risks than do non-teens (Reyna & Farley, 2006; Steinberg, 2007). So it’s true that adolescence can be a time of heightened psychological struggles for some teens.
But note that we italicized “some.” The same data show overwhelmingly that each of these difficulties is confined to only a small minority of teens. Most studies indicate that only about 20% of adolescents undergo pronounced turmoil, with the substantial majority experiencing generally positive moods and harmonious relations with their parents and peers (Offer & Schonert-Reichl, 1992). Furthermore, marked emotional upset and parental conflict are limited largely to adolescents with clear-cut psychological problems, like depression and conduct disorder (Rutter, Graham, Chadwick, & Yule, 1976), as well as to adolescents who come from disrupted family backgrounds (Offer, Kaiz, Ostrov, & Albert, 2003). So the claim that adolescent angst is either typical or inevitable doesn’t hold up (Epstein, 2007). To the contrary, it’s the exception rather than the rule. In addition, one study that followed 73 adolescent males over a 34-year period found not a shred of evidence that well-adjusted teens are at heightened risk for psychological problems later in life (Offer et al., 2002). These findings put the lie to Anna Freud’s claims that seemingly normal teens are actually abnormal and destined for psychological trouble in adulthood.
Further contradicting the view that teen storm and stress are inevitable are cross-cultural data showing that adolescence is a time of relative peace and calm in many traditional and non-Western societies (Arnett, 1999; Dasen, 2000). For example, in Japan and China, the teenage years usually pass without incident. In Japan, 80–90% of teens describe their home lives as “fun” or “pleasant” and report positive relations with their parents. We can find a similar absence of significant teenage turmoil in India, sub-Saharan Africa, Southeast Asia, and much of the Arab world (Epstein, 2007). Moreover, there’s evidence that increasing Westernization in these areas is associated with increasing adolescent distress (Dasen, 2000). We don’t know why adolescent turmoil is more common in Western than in non-Western cultures. Some authors have suggested that because parents in Western cultures, in contrast to those in most non-Western cultures, tend to treat their teenagers as children rather than as maturing adults with grown-up rights and responsibilities, these teens may rebel against their parents’ restrictions and behave antisocially (Epstein, 2007).
Can erroneous beliefs about the inevitability of adolescent turmoil do any harm? Perhaps. Dismissing some adolescents’ genuine problems as merely a “passing phase” or as a manifestation of a normal period of turmoil may result in deeply troubled teens not receiving the psychological assistance they sorely need (Offer & Schonert-Reichl, 1992). Admittedly, some teenagers’ cries for help are manipulative ploys to garner attention, but many others are signs of desperate youths whose suffering has been ignored.
Myth #8 Most People Experience a Midlife Crisis in Their 40s or Early 50s
A 45-year-old man buys the Porsche he’d dreamt about owning for years, sports a new beard, gets hair plugs, leaves his wife for a 23-year-old woman, and takes out a chunk of his retirement savings to travel to the Himalayas to study with the guru du jour. Many people in our society would chalk up his uncharacteristic behaviors to a “midlife crisis,” a period of dramatic self-questioning and turbulence in middle age (40 to 60 years old), as one confronts mortality, physical decline, and unfulfilled hopes and dreams.
The idea that many people experience a difficult life transition when poised roughly midway between birth and death isn’t of recent vintage. In the 14th century, the first lines of Dante Alighieri’s (1265–1321) epic poem the Divine Comedy evoked the idea of a midlife crisis:
Midway upon the journey of our life
I found myself within a forest dark,
For the straightforward pathway had been lost.
But it wasn’t until 1965 that Elliott Jaques coined the term “midlife crisis” to describe the compulsive attempts to remain young and defy the reality of death that he observed in middle-aged artists and composers. Jaques served up this catchy phrase for the public and scientific community to describe virtually any unsettling life transition people experience in middle age. A decade later, Gail Sheehy’s (1976) bestselling book, Passages: Predictable Crises of Adult Life, cemented the idea of a midlife crisis in the public imagination. By 1994, 86% of young adults surveyed believed in the reality of a “midlife crisis” (Lachman, Lewkowicz, Marcus, & Peng, 1994).
The film industry has pounced all over the idea of a turbulent period in midlife by depicting goofy and screwed up, yet likeable, middle-aged guys—the protagonists are mostly male—who question the meaning and value of their lives. In City Slickers (1991), three men (played by Billy Crystal, Daniel Stern, and Bruno Kirby), all experiencing a midlife crisis, take a 2-week break from their humdrum lives to go on a cattle drive from New Mexico to Colorado. A more recent riff on the same theme, the movie Wild Hogs (2007), portrays the adventures of four middle-aged men who hit the road on motorcycles to rekindle the excitement of their youth. No movie captures the supposed rut of middle age better than Groundhog Day (1993), in which comedian Bill Murray portrays Phil Connors, a heavy-drinking, self-absorbed weatherman, who’s fated to repeat the same day, every day, until he finally “gets” that his life can have meaning when he becomes a better person. In Bull Durham (1988), Kevin Costner portrays baseball player “Crash” Davis, exiled to the minor leagues to coach a talented young player. Crash is keenly aware of his youth sliding away, much like his waning ability to slide safely into home plate, but he eventually finds love and fulfillment with baseball groupie Annie Savoy (played by Susan Sarandon). In the Academy Award-winning movie, American Beauty (1999), Lester Burnham (played by Kevin Spacey) displays all of the stereotypic hallmarks of a male midlife crisis. He quits his high-pressure job to work as a burger turner, starts to use drugs, works out, buys a sports car, and becomes infatuated with his teenage daughter’s girlfriend.
The Internet and books provide advice to help people negotiate not only their midlife crisis but their spouse’s crisis as well. That’s right: Women aren’t immune to midlife angst either. The Internet site for the Midlife Club (http://midlifeclub.com/) warns its visitors that: “Whether it’s your midlife crisis, or the midlife crisis of someone you love, whether you’re a man or a woman—you’re in for a bumpy ride!” The club peddles books in which men and women who “made it through the crisis” share their wisdom, strategies, and stories with one another. For $2,500, you can purchase “LifeLaunch” through the Hudson Institute of Santa Barbara (http://www.hudsoninstitute.com). For that steep price, you can obtain intensive coaching to guide you through your midlife crisis with “vision, direction, and thoughtful planning” as you “reflect on all that you bring to the next chapter of your life.” At the other extreme of the price spectrum, you can buy Overcome Midlife Crisis for only $12.95 from HypnosisDownloads with a 100% 90-day money-back guarantee (no questions asked) and a promise that you’ll “Get rid of those midlife crisis feelings and grasp life by the horns again” (http://www.hypnosisdownloads.com/downloads/hypnotherapy/midlife-crisis.xhtml).
Psychologist Ian Gotlib (Gotlib & Wheaton, 2006) reviewed headlines and feature articles in The New York Times Living Arts section for 15 months. He discovered that editors used the term “midlife crisis” an average of twice a month to headline reviews of books, films, and television programs.
In addition to Internet and media coverage, another reason why the notion of a midlife crisis may persist is that it’s based on a shard of truth. Psychologist Erik Erikson (1968) observed that in middle adulthood, most people grapple with finding direction, meaning, and purpose in their lives, and they strive to find out whether there’s a need for a mid-course correction. We’ll see that Erikson exaggerated the prevalence of a crisis in middle age, but he was right that some people experience marked self-doubt in the intermediate years of life. Yet people reevaluate their goals and priorities and experience crises in every decade of life, as evidenced by the emotional tumult some (but by no means all; see Myth #7) teens experience. Moreover, the experiences that fall under the umbrella of the “midlife crisis” are very broad—such as change of job, divorce, buying a sports car—and nebulous. As a consequence, one could consider most any upheaval or life change proof positive of a midlife meltdown.
Some “symptoms” of a midlife crisis, such as divorce, are actually more likely to occur prior to middle age. In the United States, people first divorce, on average, within 5 years of marriage, at age 33 for men and 31 for women (Clarke, 1995). Moreover, when people purchase their fantasy sports car in their 40s, it may have nothing to do with making the best of a crisis. Rather, they may finally be able to make the payments on the car for which they longed as teenagers.
Studies across cultures provide no fodder for the idea that middle age is a particularly stressful and difficult period. In a study of 1,501 Chinese married adults between 30 and 60 years old, Daniel Shek (1996) failed to find high levels of dissatisfaction approaching a “crisis” in the majority of middle-aged men and women. Researchers funded by the MacArthur Foundation studied a total of 7,195 men and women aged 25 to 74, of whom 3,032 were interviewed in the largest study of people at midlife (Brim, Ryff, & Kessler, 2004). Contrary to the popular stereotype, people in the 40 to 60 age range generally felt more in control of their lives and expressed greater feelings of well-being compared with the previous decade of their lives. In addition, more than three quarters of respondents rated their relationships as good to excellent. Men and women were equally likely to experience what they considered to be a midlife crisis. The researchers found that concerns about having a midlife crisis were more common than actually experiencing a crisis.
Mythbusting: A Closer Look
The Empty Nest Syndrome
A mother goes into her son’s bedroom to sniff his T-shirt shortly after he leaves for college for the first time. On a website (http://www.netdoctor.co.uk/womenshealth/features/ens.htm) that recounts her unusual behavior, we learn that it’s a perfectly normal expression of the “empty nest syndrome,” a term referring to the popular belief that most women feel disturbing pangs of depression when their children leave home or get married. The popular “Chicken Soup for the Soul” self-help series even features a book devoted entirely to helping “empty nesters” adapt to the stress of their transition (Canfield, Hansen, McAdoo, & Evans, 2008).
Actually, there’s scant scientific support for the popular belief that women experience the female equivalent of the male midlife crisis when their children fly the coop, leaving the proverbial nest empty. Christine Proulx and Heather Helms (2008) interviewed 142 sets of parents after their firstborn children left home. Most parents (both men and women) made an excellent adjustment, felt the move was positive, and related more to their children as peers when they achieved greater independence. Moreover, most empty nesters actually experience an increase in life satisfaction following their newfound flexibility and freedom (Black & Hill, 1984). Recent evidence tracking marital relationships over an 18-year period points to an increase in marital satisfaction too (Gorchoff, John, & Helson, 2008).
A shift in household roles, and a sudden increase in free time, can require some adjustment for all family members. People who define themselves largely in terms of their role as parents, hold traditional attitudes toward women’s roles in society and the family, and aren’t employed outside the home may be particularly vulnerable to empty nest syndrome (Harkins, 1978). But a child “moving on” isn’t typically a devastating experience for parents, as it’s often portrayed in the media (Walsh, 1999). In fact, as children make a successful transition to young adulthood, and parents reap the rewards of many years of dedicated work raising their children, it can be an occasion for celebration.
Several other findings debunk the myth of the midlife crisis. Across studies, only 10–26% (depending on how scientists define the midlife crisis) of people report they’ve experienced a midlife crisis (Brim, 1992; Wethington, 2000). In addition, middle age can be a period of peak psychological functioning (Lachman, 2003). Clearly, a midlife crisis isn’t a prospect for everyone, or even a likely occurrence. So if you want to make radical changes in your life, and buy a red sports car or a “wild hog” motorcycle, it’s never too early—and never too late—to do so.
Myth #9 Old Age Is Typically Associated with Increased Dissatisfaction and Senility
Think of a person who matches this description: cranky, eccentric, cantankerous, afraid of change, depressed, unable to keep up with technology, lonely, dependent, physically infirm, and forgetful. We certainly wouldn’t be shocked if an elderly person came to mind—perhaps hunched, shrunken, and doddering—because the descriptors we’ve provided fit to a T popular yet inaccurate stereotypes of the elderly (Falchikov, 1990; Middlecamp & Gross, 2002).
Many people assume that a large proportion of the elderly is depressed, lonely, and irritable, lacking in sexual desire, and either senile or displaying early signs of it. Sixty-five percent of a sample of 82 introductory psychology students agreed that “most older people are lonely and isolated” and 38% that “When people grow old, they generally become ‘cranky’” (Panek, 1982, p. 105). In addition, 64% of a sample of 288 medical students said that “major depression is more prevalent among the elderly than among younger persons” (van Zuilen, Rubert, Silverman, & Lewis, 2001).
Media exposure to stereotypes—we might even say indoctrination—about the aged begins early in life (Towbin et al., 2003). In their study of Disney children’s films, Tom Robinson and his colleagues (Robinson, Callister, Magoffin, & Moore, 2007) found that 42% of elderly characters like Belle’s father from Beauty and the Beast and Madam Mim from The Sword in the Stone (and let’s not forget “Grumpy,” one of the seven dwarfs in Snow White) are portrayed in a less than positive light, and as forgetful, angry, or crotchety. Children bombarded with these and other negative stereotypes may understandably develop unfavorable impressions of seniors that begin to crystallize at an early age.
The relentless barrage of misinformation about aging persists through adulthood. In a study of popular teen movies, most elderly characters exhibited some negative characteristics, and a fifth fulfilled only negative stereotypes (Magoffin, 2007). The depressing and occasionally frightening image of aging extends to adult-oriented cartoons, television programs, and movies. Consider Grandpa Simpson from the popular television program, who was born in the “old country” but can’t seem to remember which country. Or mobster Tony Soprano’s offbeat family: his mother Livia (played by Nancy Marchand in the popular television program The Sopranos), who tried to have Tony (played by James Gandolfini) “hit” because he put her in a nursing home (“… it’s a retirement community, Ma!”), and his demented Uncle Junior (played by Dominic Chianese), who shot Tony thinking he was an enemy who’d died 20 years earlier. In the movie The Savages (2007), a son and daughter, played by Philip Seymour Hoffman and Laura Linney, respectively, struggle with their ambivalence about taking care of their elderly father (played by Philip Bosco) as he deteriorates in physical and mental health, playing with his feces and becoming increasingly forgetful.
With media fear-mongering about the seemingly inevitable ravages of aging, it’s scarcely any wonder that myths about senior citizens abound and prejudice against the elderly runs deep. John Hess (1991) chronicled how the media blame the elderly unfairly for many social and political ills, including high taxes, bankrupting the national budget due to the high costs of medical care and social security, and cutbacks on programs for children and the disabled. Surveys suggest that the emotion most college students feel toward the elderly is pity (Fiske, Cuddy, Glick, & Xu, 2002). Moreover, people rate memory problems in the elderly as signs of mental incompetence, but consider memory problems in younger individuals as due to inattention or a lack of effort (Cuddy & Fiske, 2002).
Sharply contradicting these perceptions, research demolishes the myth that old age (beginning at age 60–65) is typically associated with dissatisfaction and senility. One team of investigators surveyed adults between the ages of 21 and 40 or over age 60 about their happiness and the happiness of the average person at their current age, age 30, and at age 70. The young adults predicted that people in general would be less happy as they aged. Yet the older adults were actually happier at their current age than were younger respondents (Lacey, Smith, & Ubel, 2006).
Population-based surveys reveal that rates of depression are actually highest in individuals aged 25–45 (Ingram, Scott, & Siegle, 1999), and that the happiest group of people is men aged 65 and older (Martin, 2006). Happiness increases with age through the late 60s and perhaps 70s (Mroczek & Kolarz, 1998; Nass, Brave, & Takayama, 2006). In one study of 28,000 Americans, a third of 88-year-olds reported they were “very happy,” and the happiest people surveyed were the oldest. The odds of being happy increased 5% with every decade of life (Yang, 2008). Older people may be relatively happy because they lower their expectations (“I’ll never win a Nobel Prize, but I can be a wonderful grandparent”), accept their limitations, and recall more positive than negative information (Carstensen & Lockenhoff, 2003).
Although depression isn’t an inevitable consequence of aging, it still afflicts about 15% of the elderly. But many cases of depression in this age group are probably due not to biological aging itself, but to medical and pain conditions, the side effects of medications, social isolation, and such life events as the death of a close friend (Arean & Reynolds, 2005; Kivela, Pahkala, & Lappala, 1991; Mroczek & Spiro, 2005).
Contrary to the myth of older people as lacking in sexual desire, a national survey (Laumann, Das, & Waite, in press) of about 3,000 people indicated that more than three quarters of men aged 75 to 85 and half of their women counterparts reported still being interested in sex. Moreover, 73% of people aged 57 to 64 years were sexually active, as were most people (53%) aged 65 to 74 years. Even in the oldest group, people aged 75 to 85 years, 26% reported still being sexually active. Interestingly, health problems, such as obesity and diabetes, were better predictors than aging itself of which people stayed sexually active. As overall health declined, so did sexual activity.
Although depression and ebbing sexual desire don’t coincide with the arrival of an AARP card in the mail, people are naturally wary of the aging process in general, and memory loss in particular. Many websites poke fun at the elderly by quoting the Senility Prayer: “God, Grant me the senility to forget the people I never liked anyway, the good fortune to run into the ones I do, and the eyesight to tell the difference.” Not surprisingly, popular books address, if not prey on, fears of aging. For example, Zaldy Tan’s (2008) book title promises to Age-Proof Your Mind: Detect, Delay, and Prevent Memory Loss—Before It’s Too Late. A Nintendo game called Brain Age supposedly permits players to lower their “brain age” through mental exercises that activate their brain’s prefrontal cortex (Bennallack, 2006).
It’s natural to experience some slight memory loss as we age, including minor forgetfulness and difficulty retrieving words in conversational speech. But severe memory loss associated with Alzheimer’s disease and other forms of dementia that impair our ability to function isn’t a typical consequence of aging. People with Alzheimer’s disease experience getting lost in familiar places, personality changes, loss of language skills, difficulty in learning, and problems in completing simple daily tasks. Alzheimer’s disease afflicts as many as 4 million Americans, and the disease can last from 3 to 20 years, with the average duration being 8 years (Neath & Surprenant, 2003). As people get older, their risk of Alzheimer’s increases. Yet some people in their 30s and 40s develop Alzheimer’s, and even after age 85, about three quarters of the elderly don’t experience significant memory problems (U.S. Department of Health and Human Services, 2007).
Even at age 80, general intelligence and verbal abilities don’t decline much from younger ages, although memory for words and the ability to manipulate numbers, objects, and images are somewhat more prone to age-related declines (Riekse & Holstege, 1996). Furthermore, research on creative accomplishments indicates that in some disciplines, like history or fiction writing, many people produce their highest quality work in their 50s or several decades beyond (Rabbitt, 1999). Exercising, eating a healthy diet, solving puzzles, and staying intellectually active may slow or compensate for minor losses of cognitive prowess as people age (Whitbourne, 1996), although researchers haven’t established the effectiveness of “Brain Age” and similar products.
A final misconception about the elderly is that they’re unable to acquire new skills or are befuddled by modern gadgets. As the saying goes, “You can’t teach an old dog new tricks.” In the introductory psychology student sample we mentioned earlier, 21% agreed that “older people have great difficulty learning new skills” (Panek, 1982, p. 105). The media occasionally spoofs this image of aging people. A good example is eccentric Arthur Spooner (played by Jerry Stiller) in the television program King of Queens, who doesn’t know how to use a DVD player. But many older people aren’t intimidated by computers, iPhones, and other “newfangled devices,” and have the inclination and time to master and appreciate them. So to tweak an old (pun intended) saying, “You can teach an old dog new tricks … and a whole lot more.”
Myth #10 When Dying, People Pass through a Universal Series of Psychological Stages
DABDA.
Across the United States, scores of psychologists, psychiatrists, nurses, and social workers who work with the elderly commit this acronym to memory. DABDA stands for the five stages of dying popularized by Swiss-born psychiatrist Elisabeth Kübler-Ross (1969) in the late 1960s: Denial, Anger, Bargaining, Depression, and Acceptance. These stages, often called the “Five Stages of Grief,” supposedly describe an invariant sequence of stages that all people pass through when dying (Kübler-Ross, 1969, 1974). According to Kübler-Ross, when we learn we’re about to die, we first tell ourselves it’s not happening (denial), then become angry at the realization that it really is happening (anger), then search in vain for some way to postpone the death, perhaps at least until we can accomplish a long-valued goal (bargaining), then become sad as the realization that we’re dying sets in (depression), and finally fully come to grips with our inevitable death and approach it with a sense of serenity (acceptance).
Kübler-Ross’s stages of grief are widely accepted in the medical, psychological, and nursing communities. Surveys indicate that these stages are taught to large proportions of medical, nursing, and social work students in the United States, Canada, and the UK (Downe-Wamboldt & Tamlyn, 1997; Holleman, Holleman, & Gershenhorn, 1994).
Her stages are also a common fixture in popular culture. The award-winning 1979 film All That Jazz, based loosely on the life of choreographer Bob Fosse, featured the five Kübler-Ross stages in a dramatization of Fosse’s imagined death. In season 6 of the television program Frasier, Frasier passes through all five stages of grief after losing his job as a radio talk-show psychologist. In a hilarious depiction of Kübler-Ross’s framework in the cartoon program The Simpsons, Homer Simpson passes through all five stages in a matter of seconds after a doctor informs him (erroneously) that he’s dying. These stages are even popular in the political arena. One Internet blogger likened the waning days of George W. Bush’s presidency to each of the five Kübler-Ross stages (Grieser, 2008; http://www.democracycellproject.net/blog/archives/2008/02/kubler_ross_stages_as_applied_to_our_national_grief.xhtml), and New York Times columnist Maureen Dowd (2008) sought to explain Hillary Clinton’s reluctance to accept her Democratic nomination loss to Barack Obama in the summer of 2008 in terms of Kübler-Ross’s first several stages.
Kübler-Ross’s stages may be popular not merely because of the extensive media coverage they’ve attracted, but because they offer people a sense of predictability over the previously unpredictable—the process of dying (Copp, 1998; Kastenbaum, 1998). The thought that the often terrifying experience of death follows a standard series of stages, ending in a sense of tranquil acceptance over one’s fate, strikes many of us as reassuring. Moreover, the idea that death unfolds in the same neat and tidy way for everyone is somehow appealing, perhaps because it simplifies a mysterious process. But is it true?
Given the ubiquity of the Kübler-Ross stages in popular psychology, we might think they’d been extensively validated by psychological research. If so, we should think again. In fact, as is the case for many “stage theories” in psychology, the scientific support for these stages has been at best mixed (Kastenbaum, 2004). In retrospect, this largely negative scientific evidence shouldn’t have been all that surprising, because Kübler-Ross’s (1969) claims regarding her five stages weren’t based on carefully controlled research. In particular, her research was based almost entirely on potentially biased samples (she didn’t study a broad cross-section of the population), subjective observations, and unstandardized measurements of people’s emotions across time (Bello-Hass, Bene, & Mitsumoto, 2002; Friedman & James, 2008). Admittedly, some people do pass through some or even all of the Kübler-Ross stages of dying, so there’s probably a grain of truth to her model that lends it a sense of credibility.
Yet research evidence suggests that many dying people don’t pass through her stages in the same order (Copp, 1998). Instead, people appear to cope with their “death sentences” in many ways. Studies of dying patients reveal that many skip Kübler-Ross stages, or even pass through them in reverse order (Buckman, 1993; Kastenbaum, 1998). Some people, for example, initially accept their own death, but then later enter denial (Bello-Hass et al., 2002). Moreover, the boundaries among Kübler-Ross’s stages are often blurry, and there’s minimal evidence for sudden “jumps” from one stage to another.
Some writers have also attempted to apply Kübler-Ross’s stages to the grief we experience following the death of a loved one, like a spouse or child (Friedman & James, 2008). Yet research doesn’t bear out the validity of her stages for this kind of grief either, as grieving people don’t all undergo the same fixed series of stages (Neimeyer, 2001). For one thing, not all people experience depression or marked distress following the loss of a loved one, including those about whom they care deeply (Bonanno et al., 2002; Wortman & Boerner, 2006; Wortman & Silver, 1989). Nor is there evidence that a failure to experience depression following a serious personal loss is indicative of poor mental adjustment (Wortman & Silver, 1989). Moreover, in one study of 233 people in Connecticut who’d recently lost a spouse, acceptance, not denial, was the predominant initial reaction following loss (Maciejewski, Zhang, Block, & Prigerson, 2007). Acceptance continued to increase for the average widow or widower for 2 years following the loss.
Still other people may never fully accept the loss of their loved ones. In a study of people who’d lost a spouse or child in a motor vehicle accident, Darrin Lehman and his colleagues found that a large proportion (anywhere from 30% to 85% depending on the questions asked) of grieving people were still struggling with getting over the loss 4 to 7 years later (Lehman, Wortman, & Williams, 1987). Many said that they’d still been unable to find meaning in the tragedy.
Are there any dangers of believing in the Kübler-Ross stages? We don’t know, but some grieving or dying people may feel pressured into coping with death in the sequence that Kübler-Ross described (Friedman & James, 2008). As Lehman and his colleagues noted, “When bereaved individuals fail to conform to these unrealistic expectations, others may convey that they are coping poorly or that this is indicative of serious psychological disturbance” (Lehman et al., 1987, p. 229). For example, one of the authors of your book (SJL) worked with a dying woman who felt guilt and resentment at being told by her friends that she needed to “accept” death, even though she was trying hard to continue to enjoy her life. Whether other patients experience the same apparent negative effects of belief in the Kübler-Ross stages is a worthy topic for future research.
Dying, it seems, just doesn’t follow the same path for all of us. There’s no uniform recipe for dying or grieving for others’ death, any more than there is for living, a point that even Kübler-Ross acknowledged in her final book: “Our grief is as individual as our lives” (Kübler-Ross & Kessler, 2005; p. 1). Yet it’s safe to say that for virtually all of us, death is something we’d prefer not to think about until we need to. As Woody Allen (1976) said, “I’m not afraid of dying. I just don’t want to be there when it happens.”
Chapter 2: Other Myths to Explore
Fiction | Fact |
A mother’s bad mood can lead to a miscarriage. | There’s no evidence that sadness or stress in mothers increases the odds of miscarriages. |
The first few minutes following birth are crucial for effective parent–infant bonding. | There’s no evidence that the first few minutes after birth are essential for effective bonds to develop. |
The first three years are especially critical to infant development. | There’s considerable reason to doubt that the first three years are much more crucial for most psychological functions than are later years. |
Children given a great deal of physical encouragement and support in walking walk earlier than other children. | The emergence of walking is influenced by children’s physical development, and is largely unaffected by parental encouragement. |
Newborn babies are virtually blind and deaf. | Newborns can see and hear many things. |
Infants establish attachment bonds only to their mothers. | Infants establish strong attachment bonds with their fathers and other significant household figures. |
Mothers who talk to their children in baby talk (“motherese”) slow down their language development. | Most evidence suggests that baby talk actually facilitates children’s language development. |
Children exposed prenatally to crack cocaine (“crack babies”) develop severe personality and neurological problems in later life. | Most children exposed to crack prenatally are largely normal in personality and neurological functioning. |
Young children almost never lie. | Many young children lie about important issues, including whether they’ve engaged in immoral behavior or have been sexually abused. |
Virtually all child prodigies “burn out” by adulthood. | Although some prodigies burn out, research shows that children with extremely high IQs have much higher levels of creative accomplishment in adulthood than other children. |
Overweight children are just carrying “baby fat” that will melt away as they grow older. | Obesity in children often persists for years. |
Adoption takes a negative psychological toll on most children. | Most adopted children are psychologically healthy. |
Children raised by gay parents have higher rates of homosexuality than other children. | Children raised by gay parents haven’t been found to exhibit higher levels of homosexuality than other children. |
Marital satisfaction increases after couples have children. | Marital satisfaction consistently plummets after couples first have children, although it typically rebounds. |
People need less sleep as they get older. | The elderly need just as much sleep as the young, although because less of their sleep is consumed by “deep sleep,” they tend to awaken often. |
A large percentage of the elderly lives in nursing homes. | Only 7–8% of adults aged 75 or older live in nursing homes. |
Older people are more afraid of death than younger people. | The elderly report less fear of death, and more acceptance of death, than the young and middle aged. |
Almost all senile people suffer from Alzheimer’s disease. | Forty to fifty percent of people with dementia suffer from conditions other than Alzheimer’s disease, such as strokes and Pick’s disease. |
Excessive aluminum causes Alzheimer’s disease. | Controlled studies have found no support for this claim. |
Many people die of “old age.” | People die from accidents, violence, or disease, not from old age itself. |
Terminally ill people who’ve given up all hope tend to die shortly thereafter. | There’s no evidence for this belief. |
Terminally ill people can often “postpone” their deaths until after holidays, birthdays, or other personally significant days. | There’s no evidence for this belief, and perhaps even slight evidence that women with cancer are more likely to die right before their birthdays. |
Sources and Suggested Readings
To explore these and other myths about human development, see Bruer (1999); Caldwell and Woolley (2008); Fiorello (2001); Furnham (1996); Kagan (1998); Kohn (1990); Mercer (2010); O’Connor (2007); Panek (1982); Paris (2000).