7
Making Memories
The Workings of Working Memory
Have you ever wondered about the length of telephone numbers? When the engineers working for the Bell Telephone Company created the modern phone system in the 1950s, they had to consider a variety of factors. If the numbers were too short, there wouldn’t be enough of them to go around. If they were too long, people would make mistakes when they used them. (Remember that this was when calling someone meant repeatedly turning a dial with your index finger. It could take several seconds to dial a number, so mistakes were costly in terms of time wasted.) Most importantly, however, if the numbers were too long, people wouldn’t be able to remember them. But how long is too long?
Let’s try a little memory experiment. Before reading any further, hand this book to a family member or friend, and ask her to follow the instructions shown below:
Please read the following numbers out loud. Speak the numbers at a rate of about one per second, and try to keep the pauses between each number the same length:
3 7 2 9 5 8 1 6 0 2 7 4
Immediately after you finish, ask that the numbers be repeated out loud. Now hand the book back (and thank you!).
This is called a digit span task. Chances are, you weren’t able to repeat the entire twelve-digit sequence. Most people are able to remember the first few numbers, but then, somewhere around the middle of the list, their memory collapses like a house of cards. Whatever the span of one’s memory is, it seems to be fewer than twelve numbers.
Cognitive scientists have made use of this digit span test for a variety of purposes, but we’ll just focus on estimates of its size for now. George Miller, who was in fact one of the very first cognitive scientists, famously referred to the number of items that can be held in memory as “the magical number seven, plus or minus two.”1 And in fact, the engineers at Bell Labs made use of Miller’s research when they decided that seven digits offered the best balance between phone number length and people’s memory limitations.
But just as some people claim that age is simply a number, it turns out that digit span is rather arbitrary as well. A moment ago, you probably failed in your attempt to recall a twelve-digit number that you’d just heard. Now we’re going to give you another twelve-digit number, and ask you to recall it. And we confidently predict that you’ll do much better with this one (here’s a hint: think about dates from history):
1 4 9 2 1 7 7 6 2 0 0 1
How did you do? If you realized that this twelve-digit sequence is composed of the years of three important events in American history, then you could think about it as:
1492, 1776, and 2001
This sequence isn’t just a meaningless string of digits. The first four digits are also the year of Columbus’s discovery of the New World, the second four correspond to the year America declared independence from Britain, and the final four will always be connected with the year of the 9/11 terrorist attacks.
This would seem to contradict Miller’s claim that the normal amount of information we can remember using the digit span task is seven, or nine at most. However, think about what you did for this second example: instead of passively listening to your friend, as in the first case, you imposed meaning on the numbers. And this makes all the difference. Miller called this chunking. So one’s digit span isn’t seven plus or minus two items; it’s seven plus or minus two chunks. In his paper, Miller provided an elegant simile: short-term memory (as measured by digit span) is like a purse that can hold seven coins. However, the coins can be copper, or they can be gold.
If you think about chunking, then, phone numbers aren’t really seven numbers long. This is because area codes are meaningful, not just random. For example, in the television show Seinfeld, Elaine was upset because she wanted to keep the traditional Manhattan area code of 212, rather than be assigned one of the newer numbers. For Elaine, and others, the area code 212 means Manhattan. So if an area code is thought of as one meaningful chunk, instead of three separate numbers, then even telephone numbers with area codes are still comfortably within a normal person’s digit span ability of five to nine chunks.
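The arithmetic behind Miller’s insight is easy to see. As a purely illustrative sketch (the four-digit grouping below is our own, chosen to match the history example, and is not part of Miller’s procedure):

```python
# The same twelve digits, held as twelve separate items
# versus three meaningful chunks.
sequence = "149217762001"

as_digits = list(sequence)  # twelve individual items: '1', '4', '9', ...
as_chunks = [sequence[i:i + 4] for i in range(0, len(sequence), 4)]

print(len(as_digits))   # 12 items: beyond the "seven, plus or minus two" span
print(len(as_chunks))   # 3 chunks ('1492', '1776', '2001'): well within it
```

Nothing about the digits themselves has changed; only the unit of storage has, which is exactly Miller’s point about copper coins versus gold.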
Digit span is important for our purposes because it provides a way of measuring a person’s short-term or working memory. And working memory is a key component of language comprehension. Spoken language isn’t produced all at once: a speaker articulates words one by one over time, until his thought is complete. And when we read, our eyes jump from point to point on a line of text as we decode the words individually or in groups. In either case, it’s essential to hang on to the first part of a sentence until the last words are encountered.
Working memory size is affected by many factors, such as intelligence (people with higher IQs perform better on digit span tests) and one’s mood (clinically depressed individuals perform worse). However, another major factor is one’s age. Memory span appears to increase during childhood and then plateau in the late teenage years. After the age of twenty, researchers have documented a steady decline, at least as measured by traditional digit span techniques.2 So one factor that potentially makes language learning harder for adults is the gradual diminution of one’s ability to hold several things in mind at once. This loss, while far from ideal, may not be as problematic as it first appears. Adults possess more general world knowledge than children, so they can employ chunking far more effectively. Age may adversely affect one’s digit span, but knowledge and experience make it easier to compensate for this decline by imposing meaning on otherwise disconnected items.
What does all this mean for language learning? Often in a language class, students are asked to listen to a dialogue or spoken text and then repeat back verbatim what they heard. This is a difficult task in the best of circumstances, and with age, the task becomes more difficult. In fact, even when people are asked to do this in their native language, they are often unable to do so. Native speakers will paraphrase what they hear—being true to the meaning of the phrase, even if they don’t use the exact same words.
Therefore, when language learners try to memorize and then repeat long parts of a text verbatim, they are actually testing their working memory, rather than developing linguistic competence. Such oral drills and memorization exercises discriminate against the adult language learner. “The adult learns best not by rote, but by integrating new concepts and material into already existing cognitive structures.”3
This is not to say that language students do not need to memorize anything. And it is also not to say that listening comprehension is unimportant. Of course, students will need to memorize vocabulary words and phrases. This is especially true for idiomatic expressions (for example, letting the cat out of the bag can’t really be paraphrased as releasing the feline from the sack). But rote memorization of dialogue and text is a cognitively demanding task that will most likely frustrate the adult language learner. Rather than focusing on exercises that primarily tax working memory, we suggest that adult language learners acquire new vocabulary, grammatical structures, and idiomatic expressions by focusing on meaning. Learning to chunk seemingly disconnected words into meaningful units, and focusing on meaning through the use of paraphrasing, will make the time spent studying more effective.
As you may have guessed, the story of working memory is a bit more complicated than described so far. In fact, researchers are still debating the exact size of working memory.4 But is a container metaphor the best way to conceptualize working memory in the first place?
The British psychologist Alan Baddeley and others began to suspect that working memory was more than just a temporary repository for things you’ve heard or seen. These researchers began a program of research in the 1970s that continues to the present day. Through a series of studies, they demonstrated that instead of being a monolithic structure, working memory actually consists of a number of cognitive subcomponents, the most important of which for our purposes is called the central executive.5
As we saw earlier, one way of thinking about working memory is to conceptualize it as a purse that can hold a limited number of coins. Baddeley, in contrast, conceptualized it as a workbench—a place where mental contents can be actively manipulated. Information from long-term memory can be called up from storage and brought into working memory to help with the task at hand (much like how you used your knowledge of American history to recognize significant years in our earlier example). Moving information to and from long-term memory is one of the roles of the central executive.
As you’ve gotten older, you may have noticed that you’re more easily distracted by competing demands on your attention. For example, you may start one task, say, unloading the dishwasher, and then get distracted by a phone call or a televised news report playing in a different room. And after you’ve finished your phone call or watching the news story, you may have forgotten all about your original goal of putting away the plates and silverware in the kitchen. This kind of thing can happen at any age, of course, but research suggests that a culprit for those in middle age may be a decline in the central executive’s ability to deal with competing information.6 Just as a business executive might become harried as a horde of underlings make demands for attention and decisions, the central executive may find itself juggling too many tasks, and this can lead to making errors or forgetting.
Research suggests that the efficacy of the central executive reaches its peak during one’s twenties, and declines thereafter, although perhaps not as much as previously thought.7 This has important implications for the adult language learner. By its very nature, language production involves several cognitive processes unfolding at once. When you are speaking, for example, you must keep track of what you’re trying to say, retrieving the appropriate words from memory and monitoring your listener’s face for signs of comprehension or confusion. Although this process may seem almost effortless in one’s native language, when speaking a nonnative language, the cognitive load (the amount of information that must be processed to complete the task) can severely tax the central executive.
Changes in the central executive also have implications for the process by which a new language is learned. Minimizing distractions and attention switching during study can decrease cognitive load. It’s all too easy to check one’s e-mail while completing a language exercise on the computer, but it would probably be best to avoid this kind of temptation. All of us seem to believe that we are efficient multitaskers, but the truth is that our ability to multitask is not as great as we think, and this ability does decline over time.8 Finally, most foreign language materials are designed for high school and college students, and so they may be less appropriate for someone in their forties or fifties: the multimedia bells and whistles that are used to appeal to a younger audience may simply be distracting and unhelpful.
Deep Thoughts
Our ability to remember our previous experiences can be quite impressive most of the time, but as we all know, it can also be rather fickle. Why is it, for example, that we can have trouble remembering something important, like where we parked the car, and yet be able to effortlessly remember the lyrics of songs that we haven’t heard in years, and don’t even like? Why do some things seem to “stick” in memory, while others do not?
An important part of the story may be how we think about the information that we later try to remember. According to an approach called depth of processing, one determinant of later memory is the kind of mental operations we perform as we learn something. In a classic experiment, Craik and Tulving asked participants questions about words that they were being shown. For example, participants might see the word cloud and be asked, “Is the word printed in capital letters?” or “Does the word rhyme with weight?” Such questions can be answered based on the superficial characteristics of the words themselves (how they’re printed, how they sound), and without reflection on the words’ underlying meaning. Therefore, only what is called shallow processing is required to answer the question.9
However, for other words, the participants in the study had no choice but to reflect on deeper aspects of the concepts that the words represented. To continue our example, some participants who saw the word cloud were asked “Is the word a type of fish?” while others were asked “Would the word fit the sentence ‘He met a _________ in the street?’” It’s impossible to answer these types of questions without reflecting, at least to some degree, on the conceptual characteristics of clouds (“they’re up in the sky, not swimming in a lake or walking around on the ground”).
After exposing their participants to a series of such words and questions, the researchers presented them with a set of words, and asked them to identify those that they had seen during the first phase of the study. Craik and Tulving predicted that participants’ recognition would depend on the type of task they had engaged in: those who had been asked about cloud in the deeper conditions should have better memory for the word cloud than the participants who had been asked about it in the shallower conditions.
As predicted, there was a robust depth of processing effect. For the participants who thought about whether the word had been printed in capital letters, memory was quite poor—these words were recognized, on average, just 16 percent of the time. At the other extreme, when participants were asked whether the word fit in a particular sentence, recognition accuracy was impressively high—90 percent of the words presented in that condition were recognized.
Although the depth of processing approach is not without its critics, cognitive scientists still draw upon it as a useful conceptual framework.10 And it has important implications for the study of a new language. For example, many students believe that reading aloud in a foreign language improves their speaking and reading ability, as well as their expressive fluency. And while this may be helpful to some degree, it should be clear that this is a shallow task. Since the students are focusing almost exclusively on how to correctly pronounce the words, they aren’t processing the texts deeply, and their memory for the vocabulary and the content of these passages will probably be quite poor.
In a similar way, the act of parroting back what has just been heard in a rote memory task is also shallow. It would be much better, as a deeper task, to paraphrase what you’ve heard, because in this way, you must grapple directly with the meaning of the words, and not just what they sound like.
Finally, some students believe that writing a word over and over creates “muscle memory,” which may lead to superior retention for this word. Once again, however, such repetition is inherently shallow and fails to make contact with the deeper levels of processing that will create a more durable representation in long-term memory. Breaking the word apart into its meaningful components, for example, would be a deeper task. So a student studying German and encountering the word Schadenfreude would be well advised to recognize its component parts (Schaden = “harm,” Freude = “joy”) in order to learn and remember the concept’s meaning (taking pleasure in the misfortune of others), rather than merely writing it repeatedly.
Allow Me to Elaborate
The contrast between shallow and deep processing has implications for two different strategies that people use to try to remember information. Back when people had to call the operator for a telephone number, if they didn’t have a pencil handy, they listened to the number and then just repeated it over and over: “555-1212, 555-1212, 555-1212” until the last digit was dialed. This worked just fine, unless the line was busy or the person didn’t answer. In that case, they had to call the operator back to ask for the number again. (When Richard did this he would try to disguise his voice.)
Obviously, this strategy, which is known as maintenance rehearsal, is a very ineffective way to retain any information, and yet that is precisely how many people try to learn new information. They simply cycle it through working memory, never processing it more deeply. It’s no wonder, therefore, that it fades so quickly.
In contrast, elaborative rehearsal strategies allow one to process information at a deeper level, more effectively transferring information from working memory into long-term memory. Elaborative rehearsal strategies include focusing on meaning. For example, to memorize vocabulary words, rather than simply maintaining them in working memory through repetition, better, more elaborative strategies would include paraphrasing, thinking about how the word connects to other words in your vocabulary, or thinking about how the word relates to yourself. Although you may end up studying fewer words per day, with elaborative rehearsal, the ones you do study will be more meaningful, and therefore more likely to be remembered and used correctly.
In addition, when rehearsing elaboratively, don’t forget to take into account the zone of proximal development that was discussed in chapter 2. As you review information, such as vocabulary, grammatical structures, or idiomatic expressions, some items will seem ripe for you to learn. Pick this low-hanging fruit, if you will, and incorporate, through elaboration, the new material into what you already know. With this new, expanded knowledge, you’ll have prepared yourself to tackle even more advanced material and expanded your zone of proximal development. In short, elaborate on what you know. As Ausubel famously advised: “The most important single factor influencing learning is what the learner already knows. Ascertain this and teach him accordingly.”11
Learning versus Relearning
Some of you may be reading this book because you want to relearn a foreign language that you studied previously. Perhaps twenty or thirty years ago, you studied a language in high school or college and would like to become fluent in that language now. But after such a long gap between the first time you tried to learn that language and now, could restarting on the language really be considered relearning? After all, it may seem like you’ve forgotten everything you ever knew about that language, and that trying to relearn it after thirty years would essentially mean starting from scratch. But is that really the case?
As it turns out, some of the very first studies ever conducted on human memory were about relearning. In the early 1880s, a German researcher named Hermann Ebbinghaus examined the processes of learning and forgetting. This may sound a bit peculiar at first—you either know something or you don’t, right? However, that isn’t really the case. If you run into an old acquaintance on the street, you may be able to recognize him (“I know that face is familiar”), but you may not be able to recall how you know that person, or what his name is. In other words, memory includes both recognition and recall. In the case of recognition, all that’s required is some feeling of familiarity (“I know that I used to know this person”). Recognition memory is typically excellent, even after several decades. Recall is harder, because it requires actually reproducing information, such as your acquaintance’s name.
Ebbinghaus is famous for having proposed that there is also a third way of measuring memory.12 There’s recall and recognition, but there is also relearning. Ebbinghaus reasoned that if you are able to memorize something that you once used to know faster than you can memorize something that you’ve never learned before, then something must have been kept in your long-term memory—even if you cannot consciously recall it (like your acquaintance’s name). We’re going to describe one of Ebbinghaus’s experiments in some detail, because it’s historically important, ingenious in design, and directly relevant to our original question about relearning something like one’s high school French.
To begin with, of course, Ebbinghaus needed to find something to learn. He didn’t want to memorize meaningful material, like passages from books, because he was concerned that prior knowledge of the topic or associations to other material might make the study’s results difficult to interpret. Instead, he invented and used an entirely new type of memory stimulus: the nonsense syllable. A nonsense syllable is simply a random combination of a consonant, a vowel, and another consonant, like baf, zup, or tej. These three-letter sequences aren’t words in English (or German, for that matter), but they are wordlike in that they are pronounceable and can be memorized as if they were words. Importantly for Ebbinghaus, nonsense syllables have no prior associations, and therefore can be used as a measure of “pure” memory. Ebbinghaus created several hundred of these nonsense syllables, and inscribed them on cards to be used in his studies. (In a way, Ebbinghaus’s method of memorizing nonsense syllables from cards is very much like foreign language learners’ method of trying to memorize unfamiliar vocabulary words using index cards.)
Over a period of two years, Ebbinghaus conducted more than 160 trials of his experiment, using himself as a subject. Here is how a trial in Ebbinghaus’s experiment unfolded. He selected one of his decks of cards at random (let’s say it’s deck no. 23), noted the time, and began to study the nonsense syllables. His goal was to learn them well enough to be able to recite the list from memory twice without making any mistakes. If he made a mistake, he would go back to studying the syllables until he was ready to attempt his recitation again. Eventually, he would achieve his goal, and he wrote down how long it took, which he called the original learning time.
Later, he would try to relearn the same list. He varied the amount of time that elapsed between the original learning and this relearning. The shortest interval was twenty minutes, and the longest was an entire month. In our example, let’s imagine that a week has gone by, and he is trying to relearn deck 23. He did this in the exact same way as he had learned the deck the first time—and recorded the time it took. This was now a measure of his relearning time for the list.
As you probably have guessed by now, Ebbinghaus was an extremely dedicated and careful researcher. (He must have also been someone who didn’t get bored very easily.) After two years of patient memorization and recitation, he had enough data to describe the relearning process.
Ebbinghaus quantified his performance by subtracting the relearning time from the original learning time, dividing this difference by the original learning time, and expressing the result as a percentage, which he called his savings. When he looked at the data he had collected, he found that most forgetting takes place almost immediately (see figure 7.1). This forgetting curve shows that just twenty minutes after learning a list of nonsense syllables perfectly, his savings was only about 60 percent. After an hour, this number fell to about 44 percent. And after nine hours, the savings was down to 36 percent.
Figure 7.1
Ebbinghaus’s forgetting curve.
Now, if that trend were to continue, the result should be that Ebbinghaus would have forgotten everything he had learned in just a couple of days. But that isn’t what happened. As you can see by looking at the chart, the amount that Ebbinghaus forgot, after dropping dramatically at first, began to level off. After one day, his savings was down to about 34 percent. After two days, it had dropped to about 28 percent. And after six days, the savings had fallen to 25 percent. After that, declines were negligible all the way to the longest time interval that Ebbinghaus used: at 31 days after learning a particular deck of nonsense syllables, he still had 21 percent savings (see figure 7.1).
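The savings score itself is simple arithmetic. Here is a minimal sketch, using hypothetical learning times (the specific seconds below are invented for illustration; only the resulting percentage is meant to echo the one-day figure in figure 7.1):

```python
def savings(original_time, relearning_time):
    """Percentage of the original learning time spared on relearning."""
    return 100 * (original_time - relearning_time) / original_time

# Hypothetical times, in seconds, for one deck of nonsense syllables:
# learned in 1,000 seconds, relearned a day later in 660 seconds.
print(savings(1000, 660))  # 34.0
```

A savings score of zero would mean relearning took just as long as the original learning—no trace of the earlier effort remained—while anything above zero is evidence that something survived in long-term memory.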
Although other psychologists went on to replicate Ebbinghaus’s research in different settings and with different types of materials, they all found a similar result—that although most forgetting takes place soon after learning, the material that does remain is available, even over long periods of time. Larry Squire and Pamela Slater, for example, studied participants’ ability to recognize the names of TV programs and racehorses that they had been exposed to during a fifteen-year period (from the late 1950s to the early 1970s).13 As Ebbinghaus would have predicted, these subjects displayed a gradual rate of forgetting over the years after having learned the information.
What should we make of this result? Like much of the research we discuss in this book, there is both good news and bad news. The bad news is that the process of forgetting kicks in immediately after we’re exposed to something, and like sand through our fingers, most of this information just slips away. The good news is that this rate of forgetting slows considerably as time goes by. And keep in mind that this is just one way of measuring forgetting. As we will see, recognition memory can be excellent even many decades after learning something.14 In addition, Ebbinghaus used nonsense syllables in his research. And although vocabulary words in a foreign language may seem like nonsense syllables at first, they eventually become associated with concepts and therefore stop being just “nonsense,” which also makes them easier to relearn.
The encouraging conclusion for the adult language learner is that any previous exposure to a foreign language can be helpful when relearning that language. Although you may think that you have no memory for any of the vocabulary of a foreign language you learned in high school or college, that experience means that you’ll be able to relearn those vocabulary words faster than someone who has never been exposed to that language.
Cognitive Overload
Earlier in this chapter we mentioned the concept of cognitive load. As you may recall, cognitive load refers to the amount of information that must be held and manipulated in working memory to accomplish a task. We also discussed the ways in which researchers have tried to quantify exactly how much information can be effectively processed in working memory. Although estimates vary, one can easily see that the complex task of learning a new language places huge cognitive demands on working memory. When the cognitive load imposed on working memory becomes too great, we end up in a state of cognitive overload, which means that a person is no longer able to use working memory effectively to accomplish the task at hand. Moreover, to compensate for cognitive overload, a person may focus on simple aspects that are related to the task, but which do not actually add to learning in any meaningful way. For example, if you are listening to a person speak very rapidly in your target language, you may stop focusing on trying to understand what the person is saying, and instead focus on how they are saying it—such as paying attention to their accent or gestures. Clearly, cognitive overload interferes with learning.
Unfortunately, everyone learning a new language will experience cognitive overload. When this happens, it’s important to recognize the situation for what it is and not to give up or blame yourself, the language, or the teacher. There are ways to manage the cognitive load that comes with learning a new language.
Cognitive Overload from Factors Internal to Language
Cognitive overload can happen because language is inherently complex. Although not much can be done to alter the internal workings of a particular language, it is still possible to manage the demands placed on working memory that come from learning a language.
One way to manage the cognitive load in language learning tasks is to break the task down into smaller subunits that are easier to process mentally. For example, reading Japanese is notoriously difficult. But the task of reading Japanese can be subdivided. Because Japanese has borrowed more than 20,000 words from English—loanwords such as conbini for convenience store and beddo for bed—teachers of Japanese often start by having students learn these words. In Japanese, borrowed words are written in a special script called katakana. Katakana is the easiest of the three Japanese scripts for English speakers to learn because once they can sound out the word, they often (but not always) know what it means. In terms of cognitive load, learning katakana is less taxing on working memory than learning hiragana (which is used for purely Japanese words), or kanji, which are symbols based on Chinese characters.
But even with kanji, teachers can reduce the cognitive load of this task by showing students how to break apart the character into its component parts, which have somewhat consistent meanings across the characters in which they are found. This is the technique advocated by James Heisig in his book Remembering the Kanji. For example, the kanji for fortune telling is made up of two elements: “mouth” and “divining rod.”15 This is an example of managing the cognitive load that is specific to the language itself. To use Miller’s terminology, the kanji for fortune telling has been transformed from five individual brush strokes into only two chunks, thereby increasing the capacity of short-term, or working, memory.
Cognitive Overload from Factors External to Language
Cognitive overload also occurs when the way the material is presented imposes too great a demand on working memory. With any instructional technique, therefore, it is crucial to consider how each aspect of the technique itself requires something of working memory, and then reduce or eliminate those parts of the task that impose too high a cognitive demand. External factors that increase cognitive load include time constraints, motivation, distractions, and other situational factors that are not part of the language itself.
How much or how little cognitive load can be imposed before cognitive overload results follows a pattern referred to as the Yerkes–Dodson law. The Yerkes–Dodson law describes how, for easily accomplished tasks, adding cognitive load (or what we could also call arousal or pressure) is actually helpful in accomplishing a task. In other words, performance on easy tasks actually improves when external factors are imposed.16
For example, imagine that you are driving on Interstate 80 in Nebraska on a beautiful spring day. The road is dry, smooth, and there is very little traffic. What do you do? If you are like many people, you are likely to add cognitive load to this relatively easy task by singing along with the radio, or talking to your companion, or listening to a book on tape. In fact, because the task is so easy, if you didn’t add any external complexity to it, your mind would start to wander and you could possibly fall asleep at the wheel. Therefore, adding extrinsic cognitive load is crucial to helping you accomplish the task of driving effectively.
But imagine now that traffic has suddenly gotten heavier, the sky has darkened, and rain is coming down so heavily you can hardly see the road. What do you do? The first thing you might do is to turn off the radio or tell your companion to be quiet—even though neither of those actions will make the rain stop or cause the traffic to clear up. But you do it anyway because the task of driving the car has just gotten more difficult, so you need to get rid of any unnecessary demands on your attention. If the radio kept blaring and your friend kept talking you might go into a state of cognitive overload with potentially disastrous results.
Here’s another example. In the theater it is a generally held superstition that if a dress rehearsal is great, then the actual performance will be awful. But if a dress rehearsal is awful, then the performance will be great. Why? According to the Yerkes–Dodson law, if the actors and stagehands are very well prepared, then the dress rehearsal isn’t stimulating, and without the added arousal of an audience, they don’t really try hard. The next night, when the audience is in attendance, the added external arousal is exciting and they perform quite well. In other words, if you know something well, you do better in front of a group than you do practicing alone.
Alternatively, if the performers are less well prepared, the low-pressure dress rehearsal may still go well, but the presence of an audience the next night might add too much pressure, and cognitive overload may be the result. Therefore, if you don’t know something well, you perform it better alone than you do in front of a group.
The same is true for language learning. As an adult language learner, it is important for you to control how much cognitive load you can tolerate before reaching a state of cognitive overload. If a task is easy for you, then performing it in ways that add complexity will help you perform better. If the task is difficult, then it is important that you find ways to reduce or eliminate anything that adds unnecessary cognitive demands to the task.
Factors that increase cognitive load are often built into language-learning tasks on purpose to ensure mastery of the material. For example, teachers may give timed tests because the added pressure of a time limit will show how much of the students’ language ability has become automatic. Timed tests can actually enhance the performance of speakers with a high level of proficiency, but they can impair the performance of those whose linguistic abilities are still shaky. Therefore, the addition of cognitive load cannot be said to be either good or bad. How a person responds to the additional cognitive demands placed on a task depends on the task itself, the cognitive strategy used to perform the task, and the individual’s level of mastery.17
Interpersonal factors also place burdens on available cognitive resources. Even something as routine as being polite requires extra mental processing that can unintentionally cause confusion or misunderstanding.18 In a similar way, if you are an outgoing person who is always looking for opportunities to meet new people, then a scavenger hunt in which you approach strangers in a shopping center and engage them in conversation will seem easy and fun, and therefore you will not feel cognitively overloaded when you add to the complexity of the task by trying to engage them in your target language. However, if this is exactly the kind of activity you despise, then the added stress (e.g., overcoming your natural shyness, or feeling embarrassed) could send you into a state of cognitive overload. In this case, the unnecessary external element (a scavenger hunt in which you are forced to interact with unsuspecting strangers) imposes such a huge emotional load on the linguistic task (which is, in fact, the main task) that it overloads your working memory capacity, and very little learning will actually take place.
When designing or evaluating language learning tasks, it is important to think about the cognitive load required for the task. Try to distinguish the cognitive demands that are intrinsic to the language itself from those that are being imposed externally. No specific language learning task is right or wrong in terms of cognitive load—but some will impose higher external cognitive demands on working memory than others, which may, depending on the person, enhance the learning experience or create a state of cognitive overload.
The Time-Traveling Intruder
If you’ve had to change your telephone number in the recent past, you may have experienced difficulties in remembering it at first. As you attempted to recall the sequence to answer a friend’s query, your old phone number may have popped unbidden into your mind, and it may have taken some effort to recall your new number. This annoying phenomenon is a well-understood principle of memory called proactive interference. The idea is that old learning can get in the way of, and interfere with, your ability to recall things you’ve learned more recently. The more similar the old and new learning, the more likely it is that you will experience interference.19
Numerous laboratory studies have demonstrated the power of proactive interference—and most such studies are conducted on college-age students, so even younger adults have trouble with this. In a typical experiment, participants are asked to memorize a list of similar items. This list (we’ll call it list A) might consist of words like lion, giraffe, and elephant. After learning list A, participants are asked to memorize a second list (list B), which contains items like zebra, antelope, and gazelle. After learning list B, participants are given another, unrelated task, and after a few moments are asked to recall list B. As you might imagine, this can be a challenging task because of the similarity of the words on the two lists (animals that inhabit the savannas of Africa). Just like the old telephone number, the animals from list A intrude into the participants’ awareness, making it more difficult for them to recall the animals from list B. Now imagine what might happen if participants are asked to learn list C, which also contains the names of African animals. The results aren’t pretty!
Interestingly, interference effects can occur in reverse as well. If someone asks you for your current phone number, you can probably supply it readily enough. But if they were also to ask you to recall your previous phone number, you might experience some difficulty. In this case, the new learning—your current phone number—is causing problems for the old learning—the previous number. This is referred to as retroactive interference. Like a character in a bad science fiction movie, the new learning can travel back in time and make life difficult for information that you’ve learned previously.
As a foreign language learner, it’s important to understand what causes interference effects, and what to do when you experience them. If you’ve been studying a list of Spanish nouns and find yourself making many mistakes as you attempt to recall them, you’re likely to feel frustration. Instead of giving up, however, you would be well advised to shift your study to a different set of words, like adjectives or verbs, or to a different task entirely, such as grammar. When you return to the list of nouns, you may find that your recall has improved. Researchers call this improved memory performance after a change in study material release from proactive interference. This suggests that it is wise to study different materials over shorter periods of time rather than one type of material over a longer period. For example, rather than study vocabulary for thirty minutes and then grammar for thirty minutes, it may be better to alternate them every fifteen minutes and then take a break.
It may seem perverse that the knowledge you’ve taken pains to acquire in the past can sabotage your efforts to learn new material in the present or the future. However, that’s viewing the glass as half empty. The fact that you experience interference means that you already possess a great deal of information in long-term memory. You’re not just an adult language learner—you’re a knowledgeable adult language learner. The trick is to make this prior learning work for you, not against you. First of all, be thankful that you have this prior knowledge and be glad that all of it hasn’t been forgotten. Over time, information stored in long-term memory does fade, but no matter how much you may feel you have forgotten, as we mentioned earlier in this chapter, relearning is always faster than learning. If you studied Spanish in high school and twenty years later you decide to start studying Spanish again, you clearly have an advantage over the person who has never studied it, even if you don’t think you do.
But let’s say that you’d like to study a different language now. At first, you might experience some mild proactive interference. For example, the previously learned Spanish might interfere with the new language. As you continue to study the new language, acquiring more and more linguistic information, cultural knowledge, and contextual cues, the previously learned information will interfere less and less. Since you cannot actively forget information the same way you actively learn it, interference is normal and should be expected. Therefore, rather than berate yourself for the interference, leverage your metacognitive skills to exploit your previous knowledge and experience in service of learning the new language. For example, if you previously studied Spanish but are now studying Italian, apply what you know about Romance languages in general to Italian when and where appropriate. If you are now studying Chinese, there will be fewer ways to search out and benefit from such linguistic commonalities; however, because the languages themselves are so different, there is likely to be less interference anyway.
Because we are constantly learning new information throughout our lives, it is not surprising that older adults experience more interference on memory tasks than younger adults. The fact that it happens more frequently as we age merely means we’ve learned a lot more information. But even here, there is good news. Lisa Emery, Sandra Hale, and Joel Myerson found the expected increase in interference for older adults, but they also demonstrated that both younger and older adults showed complete release from proactive interference.20 So when you begin to make more errors when studying vocabulary or grammar, just keep calm and carry on—but with a different task.