Brain Rule #12
We are powerful and natural explorers.
MY DEAR SON JOSH got a painful bee sting at the tender age of 2, and he almost deserved it.
It was a warm, sunny afternoon. We were playing the “pointing game,” a simple exercise where he would point at something, and I would look. Then we’d both laugh. Josh had been told not to touch bumblebees because they could sting him; we used the word “danger” whenever he approached one. There, in a patch of clover, he spotted a big, furry, buzzing temptress. As he reached for it, I calmly said, “Danger,” and he obediently withdrew his hand. He pointed at a distant bush, continuing our game.
As I looked toward the bush, I suddenly heard a 110-decibel yelp. While I was looking away, Josh reached for the bee, which promptly stung him. Josh had used the pointing game as a diversion, and I was outwitted by a 2-year-old.
“DANGER!” he sobbed as I held him close.
“Danger,” I repeated sadly, hugging him, getting some ice, and wondering what puberty would be like in 10 years or so.
This incident was Dad’s inauguration into a behavioral suite often called the terrible twos. It was a rough baptism for me and the little guy. Yet it also made me smile. The mental faculties kids use to distract their dads are the same ones they will use as grown-ups to discover the composition of distant suns or the next alternative energy source. We are natural explorers, even if the habit sometimes stings us. The tendency is so strong, it is capable of turning us into lifelong learners. But you can see it best in our youngest citizens, often when they seem at their worst.
Babies give researchers a clear view, unobstructed by years of contaminating experiences, of how humans naturally acquire information. Preloaded with lots of information-processing software, infants acquire information using surprisingly specific strategies, many of which are preserved into adulthood. In part, understanding how humans learn at this age means understanding how humans learn at any age.
We didn’t always think that way. If you had said something about preset brain wiring to researchers 40 years ago, their response would have been an indignant, “What are you smoking?” or, less politely, “Get out of my laboratory.” This is because researchers for decades thought that babies were a blank slate—a tabula rasa. They thought that everything a baby knew was learned through interactions with its environment, primarily with adults. This perspective undoubtedly was formulated by overworked scientists who never had any children. We know better now. Amazing strides have been made in understanding the cognitive world of the infant. Indeed, the research world now looks to babies to show how humans, including adults, think about practically everything.
Babies test everything—including you
Babies are born with a deep desire to understand the world around them and an incessant curiosity that compels them to aggressively explore it. This need for explanation is so powerfully stitched into their experience that some scientists describe it as a drive, just as hunger and thirst and sex are drives.
All babies gather information by actively testing their environment, much as a scientist would. They make a sensory observation, form a hypothesis about what is going on, design an experiment capable of testing the hypothesis, and then draw conclusions from the findings. They use a series of increasingly self-corrected ideas to figure out how the world works.
42 minutes old: Newborns can imitate
In 1979, Andy Meltzoff rocked the world of infant psychology by sticking out his tongue at a newborn and being polite enough to wait for a reply. What he found astonished him. The baby stuck her tongue back out at him! He reliably measured this imitative behavior with infants only 42 minutes old. The baby had never seen a tongue before, not Meltzoff’s and not her own, yet the baby knew she had a tongue, knew Meltzoff had a tongue, and somehow intuited the idea of mirroring. Further, the baby’s brain knew that if it stimulated a series of nerves in a certain sequence, she could also stick her tongue out. That’s definitely not consistent with the notion of tabula rasa.
I tried this with my son Noah. He and I started our relationship in life by sticking our tongues out at each other. In his first 30 minutes of life, we had struck up an imitative conversation. By the end of his first week, we were well entrenched in dialogue: Every time I came into his crib room, we greeted each other with tongue protrusions. It was purely delightful on my part and purely adaptive on his. If I had not stuck my tongue out initially, he would not be doing so with such predictability every time he saw me.
Three months later, my wife picked me up after a lecture at a medical school, Noah in tow. I was still fielding questions, but I scooped up Noah and held him close while answering. Out of the corner of my eye, I noticed Noah gazing at me expectantly, flicking his tongue out about every five seconds. I smiled and stuck my tongue out at Noah mid-question. Immediately he squealed and started sticking his tongue out with abandon, every half second or so. I knew exactly what he was doing. Noah made an observation (Dad and I stick our tongues out at each other), formed a hypothesis (I bet if I stick my tongue out at Dad, he will stick his tongue back out at me), created and executed his experiment (I will stick my tongue out at Dad), and changed his behavior as a result of the evaluation of his research (sticking his tongue out more frequently).
Nobody taught Noah, or any other baby, how to do this. And it is a lifelong strategy. You probably did it this morning when you couldn’t find your glasses, hypothesized they were in the bathroom, and went to look. From a brain science perspective, we don’t even have a good metaphor to describe how you know to do that. It is so automatic, you probably had no idea you were looking at the results of a successful experiment when you found your glasses lying on a towel.
Noah’s story is just one example of how babies use their precious preloaded information-gathering strategies to gain knowledge they didn’t have at birth. We also can see it in broken stuff, disappearing cups and temper tantrums.
12 months old: Infants analyze how objects act
Babies younger than a year old will systematically analyze an object with every sensory weapon at their disposal. They will feel it, kick it, try to tear it apart, stick it in their ear, stick it in their mouth, give it to you so that you can stick it in your mouth. They appear to be intensely gathering information about the properties of the object. Babies methodically do experiments on the objects to see what else they will do. In our household, this usually meant breaking stuff.
These object-oriented research projects grow increasingly sophisticated. In one set of experiments, babies were given a rake, and a toy was placed nearby. The babies quickly learned to use the rake to get the toy. This is not exactly a groundbreaking discovery, as every parent knows. After a few successful attempts, the babies lost interest in the toy. But not in the experiment. Again and again, they would take the toy and move it to a different place, then use the rake to grab it. You can almost hear them exclaiming, “Wow! How does this happen?”
18 months old: Objects still exist if you can’t see them
Little Emily, before 18 months of age, still believes that if an object is hidden from view, that object has disappeared. She does not have what is known as “object permanence.” That is about to change. Emily has been playing with a washcloth and a cup. She covers the cup with the cloth, and then pauses for a second, a concerned look on her brow. Slowly she pulls the cloth away from the cup. The cup is still there! She stares for a moment, then quickly covers it back up. Thirty seconds go by before her hand tentatively reaches for the cloth. Repeating the experiment, she slowly removes the cloth. The cup is still there! She squeals with delight. Now things go quickly. She covers and uncovers the cup again and again, laughing loudly each time. It is dawning on Emily that the cup has object permanence: Even if removed from view, it has not disappeared. She will repeat this experiment for more than half an hour. If you have ever spent time with an 18-month-old, you know that getting one to concentrate on anything for 30 minutes is some kind of miracle. Yet it happens, and to babies at this age all over the world.
Though this may sound like a delightful form of peekaboo, it is actually an experiment whose failure would have lethal evolutionary consequences. Object permanence is an important concept to have if you live in the savannah. Saber-toothed tigers still exist, for example, even if they suddenly duck down in the tall grass. Those who didn’t acquire this knowledge usually were on some predator’s menu.
18 months old: Your preferences aren’t the same as mine
The distance between 14 months of age and 18 months of age is extraordinary. Around 14 months, toddlers think that because they like something, the whole world likes the same thing—as summed up in the “Toddler’s Creed”:
If I want it, it is mine.
If I give it to you and change my mind later, it is mine.
If I can take it away from you, it is mine.
If we are building something together, all of the pieces are mine.
If it looks just like mine, it is mine.
If it is mine, it will never belong to anybody else, no matter what.
If it is yours, it is mine.
Around 18 months, it dawns on babies that this viewpoint may not always be accurate. They begin to learn the adage that most newlyweds have to relearn in spades: “What is obvious to you is obvious to you.”
How do babies react to such new information? By testing it, as usual. Before the age of 2, babies do plenty of things parents would rather they not do. But after the age of 2, small children will do things because their parents don’t want them to. The compliant little darlings seem to transform into rebellious little tyrants. Many parents think their children are actively defying them at this stage. (The thought certainly crossed my mind as I nursed Joshua’s unfortunate bee sting.) That would be a mistake, however. This stage is simply the natural extension of a sophisticated research program begun at birth. You push the boundaries of people’s preferences, then stand back and see how they react. Then you repeat the experiment, pushing them to their limits over and over again to see how stable the findings are, as if you were playing peekaboo. Slowly you begin to perceive the length and height and breadth of people’s desires, and how they differ from yours. Then, just to be sure the boundaries are still in place, you occasionally do the whole experiment over again.
Babies may not have a whole lot of understanding about their world, but they know a whole lot about how to get it. It reminds me of the old proverb, “Catch me a fish and I eat for a day; teach me to fish and I eat for a lifetime.”
Babies reveal more of the brain’s secrets each year
Why does a baby stick its tongue back out at you? The beginnings of a neural road map have been drawn in the past few years, at least for some of the “simpler” thinking behaviors, such as imitation. Three investigators at the University of Parma were studying the macaque, assessing brain activity as it reached for different objects in the laboratory. The researchers recorded the pattern of neural firing when the monkey picked up a raisin. One day, researcher Leonardo Fogassi walked into the laboratory and casually plucked a raisin from a bowl. Suddenly, the monkey’s brain began to fire excitedly. The recordings were in the raisin-specific pattern, as if the animal had just picked up the raisin. But the monkey had not picked up the raisin. It simply saw Fogassi do it.
The astonished researchers quickly replicated and extended their findings, and then published them in a series of landmark papers describing the existence of “mirror neurons.” Mirror neurons are cells whose activity reflects their surroundings. Cues that could elicit mirror neural responses were found to be remarkably subtle. If a primate simply heard the sound of someone doing something it had previously experienced—say, tearing a piece of paper—these neurons could fire as if the monkey were experiencing the full stimulus. It wasn’t long before researchers identified human mirror neurons. These neurons are scattered across the brain, and a subset is involved in action recognition—the classic imitative behavior of babies sticking out their tongues. Other neurons mirror a variety of motor behaviors.
We also are beginning to understand which regions of the brain are involved in our ability to learn from a series of increasingly self-corrected ideas. We use our right prefrontal cortex to predict error and to retrospectively evaluate input for errors. The anterior cingulate cortex, just south of the prefrontal cortex, signals us when perceived unfavorable circumstances call for a change in behavior. Every year, the brain reveals more and more of its secrets, with babies leading the way.
We never outgrow the desire to know
We can remain lifelong learners. No question. This fact was brought home to me as a postdoctoral scholar at the University of Washington. In 1992, Edmond Fischer and Edwin Krebs shared the Nobel Prize in Physiology or Medicine. I had the good fortune to be familiar with both their work and their offices. They were just down the hall from mine. By the time I arrived at the university, Fischer and Krebs were already in their mid-70s. The first thing I noticed upon meeting them was that they were not retired. Not physically and not mentally. Long after they had earned the right to be lounging on some tropical island, both had powerful, productive laboratories in full swing. Every day I would see them walking down the hall, oblivious to others, chatting about some new finding, swapping each other’s journals, and listening intently to each other’s ideas. Sometimes they would have someone else along, grilling them and in turn being grilled about some experimental result. They were as creative as artists, wise as Solomon, lively as children. They had lost nothing. Their intellectual engines were still revving, and curiosity remained the fuel. They taught me that our learning abilities don’t have to change as we age.
The brain remains malleable
Research shows that the brain is wired to keep learning as we age. Some regions of the adult brain stay as malleable as a baby’s brain, so we can grow new connections, strengthen existing connections, and even create new neurons, allowing all of us to be lifelong learners. We didn’t always think that. Until five or six years ago, the prevailing notion was that we were born with all of the brain cells we were ever going to get, and they steadily eroded in a depressing journey through adulthood to old age. We do lose synaptic connections with age. Some estimates of neural loss alone are close to 30,000 neurons per day. But the adult brain also continues creating neurons within the regions normally involved in learning. These new neurons show the same plasticity as those of newborns.
Throughout life, your brain retains the ability to change its structure and function in response to your experiences.
Why? Evolutionary pressure, as usual. Problem solving was greatly favored in the unstable environment of the Serengeti. But not just any kind of problem solving. When we came down from the trees to the savannah, we did not say to ourselves, “Good Lord, give me a book and a lecture and a board of directors so that I can spend 10 years learning how to survive in this place.” Our survival did not depend upon exposure to organized, preplanned packets of information. Our survival depended upon chaotic, reactive information-gathering experiences. That’s why one of our best attributes is the ability to learn through a series of increasingly self-corrected ideas. “The red snake with the white stripe bit me yesterday, and I almost died,” is an observation we readily made. Then we went a step further: “I hypothesize that if I encounter the same snake, the same thing will happen!” It is a scientific learning style we have exploited literally for millions of years. It is not possible to outgrow it in the whisper-short seven to eight decades we spend on the planet.
So it’s possible for us to continue exploring our world as we age. Of course, we don’t always find ourselves in environments that encourage such curiosity as we grow older. I’ve been fortunate to have a career that allowed me the freedom to pick my own projects. Before that, I was lucky to have my mother.
Encouraging curiosity with a passion
I remember, when I was 3 years old, developing a sudden interest in dinosaurs. I had no idea that my mother had been waiting for it. That very day, the house began its transformation into all things Jurassic. And Triassic. And Cretaceous. Pictures of dinosaurs would go up on the wall. I would begin to find books about dinosaurs strewn on the floor and sofas. Mom would even call dinner “dinosaur food,” and we would spend hours laughing our heads off trying to make dinosaur sounds. And then, suddenly, I would lose interest in dinosaurs, because some friend at school acquired an interest in spaceships and rockets and galaxies. Extraordinarily, my mother was waiting. Just as quickly as my whim changed, the house would begin its transformation from big dinosaurs to Big Bang. The reptilian posters came down, and in their places, planets would begin to hang from the walls. I would find little pictures of satellites in the bathroom. Mom even got “space coins” from bags of potato chips, and I eventually gathered all of them into a collector’s book.
This happened over and over again in my childhood. I got an interest in Greek mythology, and she transformed the house into Mount Olympus. My interests careened into geometry, and the house became Euclidean, then cubist. Rocks, airplanes. By the time I was 8 or 9, I was creating my own house transformations.
One day, around age 14, I declared to my mother that I was an atheist. She was a devoutly religious person, and I thought this announcement would crush her. Instead, she said something like “That’s nice, dear,” as if I had just declared I no longer liked nachos. The next day, she sat me down by the kitchen table, a wrapped package in her lap. She said calmly, “So I hear you are now an atheist. Is that true?” I nodded yes, and she smiled. She placed the package in my hands. “The man’s name is Friedrich Nietzsche, and the book is called Twilight of the Idols,” she said. “If you are going to be an atheist, be the best one out there. Bon appetit!”
I was stunned. But I understood a powerful message: Curiosity itself was the most important thing. And what I was interested in mattered. I have never been able to turn off this fire hose of curiosity.
Most developmental psychologists believe that a child’s need to know is a drive as pure as a diamond and as distracting as chocolate. Even though there is no agreed-upon definition of curiosity in cognitive neuroscience, I couldn’t agree more. I firmly believe that if children are allowed to remain curious, they will continue to deploy their natural tendencies to discover and explore until they are 101. This is something my mother seemed to know instinctively.
For little ones, discovery brings joy. Like an addictive drug, exploration creates the need for more discovery so that more joy can be experienced. It is a straight-up reward system that, if allowed to flourish, will continue into the school years. As children get older, they find that learning brings them not only joy but also mastery. Expertise in specific subjects breeds the confidence to take intellectual risks. If these kids don’t end up in the emergency room, they may end up with a Nobel Prize.
I believe it is possible to break this cycle, anesthetizing both the process and the child. By first grade, for example, children learn that education means an A. They begin to understand that they can acquire knowledge not because it is interesting, but because it can get them something. Fascination can become secondary to “What do I need to know to get the grade?” But I also believe the curiosity instinct is so powerful that some people overcome society’s message to go to sleep intellectually, and they flourish anyway.
My grandfather was one of those people. He was born in 1892 and lived to be 101 years old. He spoke eight languages, went through several fortunes, and remained in his own house (mowing his own lawn) until the age of 100. He was lively as a firecracker to the end. At a party celebrating his centenary, he took me aside. “You know, Juanito,” he said, clearing his throat, “sixty-six years separate the Wright brothers’ airplane from Neil Armstrong and the moon.” He shook his head, marveling. “I was born with the horse and buggy. I die with the space shuttle. What kind of thing is that?” His eyes twinkled. “I live the good life!”
He died a year later.
I think of him a lot when I think of exploration. I think of my mother and her magically transforming rooms. I think of my youngest son experimenting with his tongue, and my oldest son’s overwhelming urge to take on a bee sting. And I think that we must do a better job of encouraging lifelong curiosity, in our workplaces, our homes, and especially in our schools.
More ideas
On a personal level, what this tells us is to follow our passions. But I would also like to see change on a broader scale so that our environments truly support our individual efforts to remain curious.
Free time at work
Smart companies take to heart the power of exploration. For example, companies such as 3M, Genentech, and Google have allowed employees to use 15 or 20 percent of their workweek to go where their minds ask them to go. The proof is in the bottom line: At Google, fully 50 percent of new products—including Gmail, Google News, and AdSense—came from “20 percent time.” Facebook, LinkedIn, and other tech companies hold “hackathons”: marathon programming sessions where coders can earn prizes for creating something interesting.
Schools where you learn on the job
If you could step back in time to one of the first Western-style universities, say, the University of Bologna, and visit its biology labs, you would laugh out loud. I would join you. By today’s standards, biological science in the 11th century was a joke, a mix of astrological influences, religious forces, dead animals, and rude-smelling chemical concoctions. But if you went down the hall and peered inside Bologna’s standard lecture room, you wouldn’t feel as if you were in a museum. You would feel at home. There is a lectern for the teacher to hold forth, surrounded by chairs for the students to absorb whatever is being held forth—much like today’s classrooms. Could it be time for a change?
Some people have tried to harness our natural exploratory tendencies by using “problem-based” or “discovery-based” learning models. What’s missing are empirical results that show the long-term effects of these styles. To this end, I would like to see more degree programs modeled after medical schools. The best medical-school model has three components: a teaching hospital; faculty who work in the field as well as teach; and research laboratories. It is a surprisingly successful way of transferring complex information from one brain to another. Students get consistent exposure to the real world, by the third year spending half of their time in class and half learning on the job. They are taught by people who actually do what they teach as their “day job.” And they get to participate in practical research programs.
Here’s a typical experience in medical school: The clinician-professor is lecturing in a traditional classroom setting and brings in a patient to illustrate some of his points. The professor announces: “Here is the patient. Notice that he has disease X with symptoms A, B, C, and D.” He then begins to lecture on the biology of disease X. While everybody is taking notes, a smart medical student raises her hand and says, “I see symptoms A, B, C, and D. What about symptoms E, F, and G?” The professor looks a bit chagrined (or excited) and responds, “We don’t know about symptoms E, F, and G.” You can hear a pin drop at those moments, and the impatient voices whispering inside the students’ heads are almost audible: “Well, let’s find out!” These are the opening words of most of the great research ideas in human medicine.
That’s true exploratory magic. The tendency is so strong that you have to deliberately cut off the discussions to keep the ideas from forming. Rather than cutting off such discussions, most American medical schools possess powerful research wings. By simple juxtaposition of real-world needs with traditional book learning, a research program is born.
I envision a college of education where the program is all about brain development. Like a medical school, it is divided into three parts: traditional classrooms; a community school that functions much like a teaching hospital; and research laboratories. It is staffed and run by three types of faculty: traditional education faculty who teach the college students, certified teachers who teach the little ones attending the community school, and brain scientists who run research labs devoted to a single purpose: investigating how the human brain learns in teaching environments, then actively testing hypothesized ideas in real-world classroom situations.
Students would earn a bachelor of science in education, and these future educators would be infused with deep knowledge about how the human brain acquires information. After their first year of study, they would start actively participating in the on-site school.
This model honors our evolutionary need to explore. It creates teachers who know about brain development. And it’s a place to do the real-world research so sorely needed to figure out how, exactly, the rules of the brain should be applied to our lives. The model could apply to other academic subjects as well. A business school teaching how to run a small business might actually run one, for example.
A student could create a version of this learning experience on her own, by seeking out internship opportunities while in school.
My 2-year-old son Noah and I were walking down the street on our way to preschool when he suddenly noticed a shiny pebble embedded in the concrete. Stopping midstride, the little guy considered it for a second, found it thoroughly delightful, and let out a laugh. He spied a small plant an inch farther, a weed valiantly struggling through a crack in the asphalt. He touched it gently, then laughed again. Noah noticed beyond it a platoon of ants marching in single file, which he bent down to examine closely. They were carrying a dead bug, and Noah clapped his hands in wonder. There were dust particles, a rusted screw, a shiny spot of oil. Fifteen minutes had passed, and we had gone only 20 feet. I tried to get him to move along, having the audacity to act like an adult with a schedule. He was having none of it. And I stopped, watching my little teacher, wondering how long it had been since I had taken 15 minutes to walk 20 feet.
The greatest Brain Rule of all is something I cannot prove or characterize, but I believe in it with all my heart. As my son was trying to tell me, it is the importance of curiosity. For his sake and ours, I wish classrooms and companies were designed with the brain in mind. If we started over, curiosity would be the most vital part of both the demolition crew and the reconstruction crew. As I hope to have related here, I am very much in favor of both.
I will never forget the moment my little professor taught his daddy about what it meant to be a student. I was thankful and a little embarrassed. After 47 years, I was finally learning how to walk down the street.
Brain Rule #12
We are powerful and natural explorers.
• Babies are the model of how we learn—not by passive reaction to the environment but by active testing through observation, hypothesis, experiment, and conclusion.
• Specific parts of the brain allow this scientific approach. The right prefrontal cortex looks for errors in our hypothesis (“The saber-toothed tiger is not harmless”), and an adjoining region tells us to change behavior (“Run!”).
• We can recognize and imitate behavior because of “mirror neurons” scattered across the brain.
• Some parts of our adult brains stay as malleable as a baby’s so that we can create neurons and learn new things throughout our lives.