brain rule
Feeling safe enables learning
Nothing in President Theodore Roosevelt’s early life suggested even a whiff of future greatness. He was a sickly child, nervous and timid, and so asthmatic that he had to sleep upright in bed to keep from asphyxiating. He was too ill to attend formal classes, forcing his parents to school him at home. Citing a serious heart condition, his doctors suggested he find a line of work that would tether him to a desk and by all means avoid strenuous physical activity.
Fortunately, Roosevelt’s mind did not cooperate with either his body or his doctors. Possessed of a voracious intellect, a photographic memory, and a ceaseless need to achieve, he wrote his first scientific paper (“The Natural History of Insects”) at the age of 9. He was accepted to Harvard at the age of 16, graduated Phi Beta Kappa, ran for the state legislature at age 23, and published his first scholarly book, a history of the War of 1812, the next year. He gained a reputation as a thought-provoking historian and, eventually, an able politician. And zoologist. And philosopher, geographer, warrior, and diplomat. Roosevelt became commander in chief at the age of 42, the youngest ever. He remains the only president awarded the Medal of Honor, and he was the first American to win the Nobel Peace Prize.
What made Roosevelt so darn smart, given his less than auspicious start? Clearly, genetics helped our 26th president. For all of us, nature controls about 50 percent of our intellectual horsepower, and environment determines the rest. This means two things for parents: First, no matter how hard your child tries, there will be limits to what his brain can do. Second, that’s only half of the story. Aspects of your child’s intelligence will be deeply influenced by his environment, especially by what you do as parents. We’ll look at both the seed and the soil. This chapter discusses the biological basis of a child’s intelligence. The next chapter explains what you can do to optimize it.
What a smart brain looks like
If you could peer inside your baby’s brain, would there be clues to her future intellectual greatness? What does intelligence look like in the twists and folds of the brain’s convoluted architecture? One obvious, if ghoulish, way to answer these questions is to look at the brains of smart people after they have died, seeking clues to intelligence in their neural architecture. Scientists have done this with a variety of famous brains, from the mathematical German Carl Friedrich Gauss to the not-so-mathematical Russian Vladimir Lenin. They’ve studied Albert Einstein’s brain, too, with surprising results.
Just your average genius
Einstein died in New Jersey in 1955. His autopsy was performed by Thomas Stoltz Harvey, who must go down as the most possessive pathologist in history. He excised the famous physicist’s brain and photographed it from many angles. Then he chopped the brain into tiny blocks. Then he got into trouble. Harvey apparently did not secure permission from Einstein or his family to pixellate the physicist’s famous brain. Princeton Hospital administrators demanded that Harvey surrender Einstein’s brain. Harvey refused, forfeited his job, fled to Kansas, and held the preserved samples for more than 20 years. From time to time, Harvey would send researchers tantalizing bits of Einstein’s brain for analysis. Finally, Harvey decided to give Einstein’s brain, or at least what was left of it, back to the chief of pathology at Princeton Hospital. Now the tissues could be subject to more systematic study, with scientists looking for clues that would reveal Einstein’s genius.
What did they discover? The most surprising finding was that there was nothing surprising. Einstein had a fairly average brain. The organ had a standard internal architecture, with a few structural anomalies. The regions responsible for visuospatial cognition and math processing were a bit larger, 15 percent fatter than average. He was also missing some sections that less agile brains possess, and he carried a few more glial cells than most people do (glial cells help give the brain its structure and assist with information processing). None of these results are very instructive, unfortunately. Most brains possess structural abnormalities: some regions more shrunken than others, some more swollen. Because of this individuality, it is currently impossible to demonstrate that certain physical differences in brain structure lead to genius. Einstein’s brain certainly was smart, but not one of its dice-sized pieces would definitively tell us why.
What about looking at live, functioning brains? These days, you don’t have to wait until someone is dead to determine structure-function relationships. You can use noninvasive imaging technologies to look in on the brain while it is performing some task. Can we detect smartness in people by observing an organ caught in the act of being itself? The answer, once again, is no. Or at least not yet. When you examine living geniuses solving some tough problem, you do not find reassuring similarities; you find disconcerting individualities. Problem solving and sensory processing do not look the same in any two brains.
This has led to great confusion and contradictory findings. Some studies purport to show that “smart” people have more efficient brains (they use less energy to solve tough problems), but other researchers have found exactly the opposite. Gray matter is thicker in some smart people, white matter thicker in others. Scientists have found 14 separate regions responsible for various aspects of human intelligence, sprinkled throughout the brain like cognitive pixie dust. These magical regions are nestled into an idea called P-FIT, short for Parieto-Frontal Integration Theory. When P-FIT regions are examined as people think deep thoughts, researchers again find that, frustratingly, overarching patterns are few. Different people use varying combinations of these regions to solve complex problems. These combinations probably explain the wide variety of intellectual abilities we can observe in people.
We have even less information about a baby’s intelligence. It is very difficult to do noninvasive imaging experiments with the diaper-and-pull-up crowd. To do a functional MRI, for example, the head needs to stay perfectly still for long stretches of time. Good luck trying to do that with a wiggly 6-month-old! Even if you could, given our current understanding, brain architecture cannot successfully predict whether your child is going to be smart.
In search of a “smart gene”
How about at the level of DNA? Have researchers uncovered a “smart gene”? A lot of people are looking. Variants of one famous gene, called COMT (catechol-O-methyl transferase, since you asked), appear to be associated with higher short-term-memory scores in some people, though not in others. Another gene, cathepsin D, also was linked to high intelligence. So was a variant of a dopamine receptor gene, from a family of genes usually involved in feeling pleasure. The problem with most of these findings is that they have been difficult to replicate. Even when they have been successfully confirmed, the presence of the variant usually accounted for a boost of only 3 or 4 IQ points. To date, no intelligence gene has been isolated. Given the complexity of intelligence, I highly doubt there is one.
Bingo: A baby IQ test
If cells and genes aren’t any help, what about behaviors? Here, researchers have struck gold. We now have in hand a series of tests for infants that can predict their IQs as adults. In one test, preverbal infants are allowed to feel an object hidden from their view in a box. If the infants can then correctly identify the object by sight—called cross-modal transfer—they will score higher on later IQ tests than infants who can’t. In another test, measuring something researchers call visual recognition memory, infants are set in front of a checkerboard pattern. This is an oversimplification, but the longer the infants stare, the higher their IQ is likely to be. Sound unlikely? These measurements, taken between 2 and 8 months of age, correctly predicted IQ scores at age 18!
What does that really mean? For one thing, it means that when these children reach school age, they will do well on an IQ test.
The intelligence of IQ
IQ matters a lot to some people, such as the admissions officers of elite private kindergartens and elementary schools. They often demand that children take intelligence tests; the WISC-IV, short for Wechsler Intelligence Scale for Children, Fourth Edition, is common. Many schools accept only those kids who score in the ridiculously high 97th percentile. These $500 tests are sometimes administered to 6-year-olds or even younger children, serving as an entrance exam to kindergarten!
Here are two typical questions on IQ tests:
1. Which one of the five is least like the other four? Cow, tiger, snake, bear, dog.
Did you say snake? Congratulations. The testers who designed the question agree with you (all the other animals have legs; all the others are mammals).
2. Take 1000 and add 40 to it. Now add another 1000. Now add 30. And another 1000. Now add 20. Now add another 1000. Now add 10. What is the total?
Did you say 5,000? If so, you’re in good company. Research shows that 98 percent of people who tackle this question get that answer. But it is wrong. The correct answer is 4,100.
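Walk through the running total and the trap reveals itself: 1,000 plus 40 is 1,040; another 1,000 makes 2,040; plus 30 is 2,070; another 1,000 makes 3,070; plus 20 is 3,090; another 1,000 makes 4,090; and the final 10 brings you to 4,100. The steady rhythm of thousands primes your brain to round 4,090 plus 10 up to a tidy 5,000.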
IQ tests are filled with questions like these. If you get them right, does that mean you are smart? Maybe. But maybe not. Some researchers believe IQ tests measure nothing more than your ability to take IQ tests. The fact is, researchers don’t agree on what an IQ test measures. Given the range of intellectual abilities that exist, it is probably smart to reject a one-number-fits-all notion as the final word on your baby’s brain power. Armed with a little history on these inventories, you can decide for yourself.
The birth of the IQ test
Many sharp folks have investigated the definition of human intelligence, often in an attempt to figure out their own unique gifts. One of the first was Francis Galton (1822–1911), half cousin to Charles Darwin. Possessed of enormous and fashionable pork-chop sideburns but otherwise balding, Sir Francis was stern, brilliant, and probably a little crazy. He came from a famous line of pacifist Quakers whose family business was, oddly enough, the manufacture of guns. Galton was a prodigy, reading and quoting Shakespeare by the time he was 6, and speaking both Greek and Latin at an early age. He seemed to be interested in everything, as an adult making contributions to meteorology, psychology, photography, and even criminal justice (he advocated for the scientific analysis of fingerprints to identify criminals). Along the way, he invented the statistical concepts of standard deviation and linear regression, and he used them to study human behavior.
One of his chief fascinations concerned the engines that power human intellect—especially inheritance. Galton was the first to realize that intelligence both had heritable characteristics and was powerfully influenced by environment. He’s the one who coined the phrase “nature versus nurture.” Because of these insights, Galton is probably the man most responsible for inspiring scientists to consider the definable roots of human intelligence. But as researchers began to investigate the matter systematically, they developed a curious compulsion to describe human intelligence with a single number. Tests were used—and are still used today—to yield such numbers. The first yields our oft-mentioned IQ, short for “intelligence quotient.”
IQ tests originally were designed by a group of French psychologists, among them Alfred Binet, innocently attempting to identify children who might need help in school. The group devised 30 tasks that ranged from touching one’s nose to drawing patterns from memory. The design of these tests had very little empirical support in the real world, and Binet consistently warned against interpreting them literally. He presciently believed that intelligence was quite plastic and that his tests had real margins of error. But German psychologist William Stern began using the tests to measure children’s intelligence, quantifying the scores with the term “IQ.” The score was the ratio of a child’s mental age to his or her chronological age, multiplied by 100. So a 10-year-old who could solve problems normally solved only by 15-year-olds had an IQ of 150: (15/10) × 100. The tests became very popular in Europe, then floated across the Atlantic.
In 1916, Stanford professor Lewis Terman removed some of the questions and added new ones—also without much empirical reason to do so. The configuration has been known as the Stanford-Binet test ever since. Eventually, the ratio was changed to a number distributed along a bell curve, with the average set to 100. A second test, developed in 1923 by British Army officer-turned-psychologist Charles Spearman, measured what he called “general cognition,” now simply referred to as “g.” Spearman observed that people who scored above average on one subcategory of pencil-and-paper tests tended to do well on the rest of them. His test measures the tendency of performance across a large number of cognitive tasks to be intercorrelated.
Battles have been raging for decades about what these test scores mean and how they should be used. That’s a good thing, because intelligence measures are more plastic than many people realize.
Gaining and losing a pound of IQ
I remember the first time I saw the actress Kirstie Alley on-screen, playing a smart, sexy character in a Star Trek movie. A former cheerleader, Kirstie went on to star in a number of television shows, including an Emmy-winning role on the legendary sitcom Cheers. But she may be better known for her issues with weight. In 2005, Kirstie reportedly weighed more than 200 pounds, mostly because of poor eating habits. She became a spokesperson for a weight-loss program and at one point starred in a television show about an overweight actress attempting to get work in Hollywood. She eventually lost 75 pounds. Since then, however, her weight has continued to fluctuate.
What does this unstable number have to do with our discussion of intelligence? Like Kirstie’s dress size, IQ is malleable. IQ has been shown to vary over one’s life span, and it is surprisingly vulnerable to environmental influences. It can change if one is stressed, old, or living in a different culture from the testing majority. A child’s IQ is influenced by his or her family, too. Growing up in the same household tends to increase IQ similarities between siblings, for example. Poor people tend to have significantly lower IQs than rich people. And if you earn below a certain income level, economic factors will have a much greater influence on your child’s IQ than if your child is middle class. A child born in poverty but adopted into a middle-class family will on average gain 12 to 18 points in IQ.
There are people who don’t want to believe IQ is so malleable. They think numbers like IQ and “g” are permanent, like a date of birth instead of a dress size. The media often cast our intellectual prowess in such permanent terms, and our own experience seems to agree. Some people are born smart, like Theodore Roosevelt, and some people are not. The assumption is reassuringly simplistic. But intelligence isn’t simple, and neither is our ability to measure it.
You’d be smart to reject a one-number-fits-all notion as the final word on your baby’s brain power.
Smarter every year
One damning piece of evidence is the fact that somehow IQs have been increasing for decades. From 1947 to 2002, the collective IQ of American kids went up 18 points. James Flynn, a crusty, wild-haired old philosopher from New Zealand, discovered this phenomenon (a controversial finding cheerfully christened the “Flynn Effect”). Flynn set up the following thought experiment. He took the average American IQ of 100, then ran the numbers backward from the year 2009 at the observed rate. He found that the average IQ of Americans in 1900 would have been between 50 and 70—the same range scored by most people with Down syndrome, a classification termed “mild mental retardation.” Most of our citizens at the turn of the century did not have Down syndrome. So is there something wrong with the people or something wrong with the metric? Clearly, the notion of IQ permanence needs some retooling.
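Flynn’s back-calculation is simple arithmetic: a gain of 18 points over the 55 years between 1947 and 2002 works out to roughly a third of a point per year. Project that rate backward over the 109 years separating 2009 from 1900 and you subtract about 36 points, landing the turn-of-the-century average squarely inside his 50-to-70 range.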
I certainly believe in the concept of intelligence, and I think IQ and “g” assess some aspect of it. So do many of my colleagues, who signed a 1997 editorial in the research journal Intelligence declaring that “IQ is strongly related, probably more so than any other single measurable human trait, to many important educational, occupational, economic and social outcomes.” I agree. I just wish I knew what was actually being measured.
What does it mean to be smart?
The variability of these IQ tests can be frustrating. Parents want to know if their kid is smart. And they want their kid to be smart. Given our knowledge-based 21st-century economy, that makes sense. When you drill down on the subject, however, many parents really mean that they want their kids to be academically successful, which is a better guarantee of their future. Are “smart” and “grade-point average” related? They are, but they are not the same thing, and the link is not as strong as one might think.
Single numbers—or even correlations between single numbers—simply do not have enough flexibility to describe the many complexities of human intelligence. Harvard psychologist Howard Gardner, who published his latest theory of multiple intelligences in 1993, put it this way: Strong evidence exists “that the mind is a multifaceted, multicomponent instrument, which cannot in any legitimate way be captured in a single paper-and-pencil-style instrument.” Ready to cry uncle? Is intelligence going to be the province of comments like “I don’t know what it is, but I know it when I see it”? No, but to see the issue more clearly, we are going to have to replace this one-number-fits-all notion.
Human intelligence is more like ingredients in a stew than numbers in a spreadsheet.
Mom’s beef stew: Five ingredients of intelligence
The smell of my mother’s beef stew simmering in the kitchen on a cold winter’s day is easily the best comfort-food memory I have. The crackling sounds of braised beef; the sweet, stinging smell of chopped onions; the delightful sight of quarter-sized medallions of carrots floating in a Crock-Pot … Mom’s stew was like a warm hug in a bowl.
She once marched me into the kitchen to teach me how to make her famous beef stew. No easy task, this, for she had the annoying habit of changing the recipe almost every time she made it. “It depends on who’s coming over for dinner,” Mom would explain, “or whatever we have lying around the house.” According to her, only two elements were critical to pull off her masterpiece. One was the quality of the beef. The other was the quality of the gravy surrounding the meat. If those issues were settled, the stew was going to be a success, regardless of what else went into the pot.
Like Mom’s stew, human intelligence has two essential components, both fundamentally linked to our evolutionary need to survive. The first is the ability to record information. This is sometimes called “crystallized intelligence.” It involves the various memory systems of the brain, which combine to create a richly structured database.
The second component is the capacity to adapt that information to unique situations. This involves the ability to improvise, based in part on the ability to recall and recombine specific parts of the database. This capacity for reasoning and problem solving is termed “fluid intelligence.” From an evolutionary perspective, the potent combination of memorization and extemporization conferred on us two survival-rich behaviors: the ability to learn rapidly from our mistakes and the ability to apply that learning in unique combinations to the ever-changing, ever-brutal world of our East African cradle.
Intelligence, seen through this evolutionary lens, is simply the ability to do these activities better than someone else.
Mandatory as memory and fluid intelligence are, though, they are not the entire recipe for human smarts. Just like my mother’s shifting recipe, different families have different combinations of talents stewing in their cerebral Crock-Pots. One son might have a poor memory but dynamite quantitative skills. One daughter might display an extraordinary penchant for language yet remain mystified by even simple division. How can we say one child is less intelligent than the other?
Many ingredients make up the human intelligence stew, and I’d like to describe five that I think you would do well to consider as you contemplate your child’s intellectual gifts. They are:
• The desire to explore
• Self-control
• Creativity
• Verbal communication
• Interpreting nonverbal communication
Most of these characteristics fall outside the spectrum of the usual IQ suspects. We believe many have genetic roots; most can be seen in newborns. Rooted as these five ingredients may be in our evolutionary history, however, they do not exist in isolation from the outside world. Nurture—even for Teddy Roosevelt—plays an important role in whether a child is able to maximize his or her intelligence.
1. The desire to explore
This is one of my favorite examples of an infant’s penchant for exploration. I was attending the Presbyterian baptism of a 9-month-old. Things started out well enough. The infant was nestled quietly in his dad’s arms, waiting for his turn to be sprinkled in front of the congregation. As the parents turned to face the pastor, the baby spied the handheld microphone. He quickly tried to wrest the mike out of the pastor’s grip, flicking his tongue out at the ball of the microphone. The little guy seemed to think that the mike was some kind of ice cream cone, and he decided to test his hypothesis.
This was highly inappropriate Presbyterian behavior. The pastor swung the microphone out of reach and immediately realized his mistake: Even in the preverbal crowd, hell hath no fury like a scientist denied his data. The baby howled, tried to wiggle free, and clawed at the microphone, all while licking at the air. He was exploring, darn it, and he did not appreciate being interrupted in the pursuit of knowledge. Especially if it involved sugar.
I’m not sure about the parents, but I was delighted to see such a fine example of pediatric research enthusiasm. Parents have known that children were natural scientists long before there were microphones. But it wasn’t until the last half of the 20th century that we could isolate components of their wonderful exploratory behaviors.
Thousands of experiments confirm that babies learn about their environment through a series of increasingly self-corrected ideas. They experience sensory observations, make predictions about what they observe, design and deploy experiments capable of testing their predictions, evaluate their tests, and add that knowledge to a self-generated, growing database. The style is naturally aggressive, wonderfully flexible, and annoyingly persistent. They use fluid intelligence to extract information, then crystallize it into memory. Nobody teaches infants how to do this, yet they do it all over the world. This hints at the behavior’s strong evolutionary roots. They are scientists, as their parents suspected all along. And their laboratory is the whole world, including microphones in church.
An innovator’s DNA
Exploratory behavior—the willingness to experiment, to ask extraordinary questions of ordinary things—is a talent highly prized in the working world, too. Good ideas tend to make money. The trait seems to be as valuable a survival strategy today as it was on the plains of the Serengeti.
What traits separate creative, visionary people who consistently conjure up financially successful ideas from less imaginative, managerial types who carry them out? Two business researchers explored that simple question. They conducted a whopping six-year study with more than 3,000 innovative executives, from chemists to software engineers. Published in 2009, the study won an award from Harvard Business Review.
Visionaries had in common five characteristics, which the researchers termed “Innovator’s DNA.” Here are the first three:
• An unusual ability to associate. They could see connections between concepts, problems, or questions not obvious to others.
• An annoying habit of consistently asking “what if.” And “why not” and “how come you’re doing it this way.” These visionaries scoured the limits of the status quo, poking it, prodding it, shooting upward to the 40,000-foot view of something to see if it made any sense and then plummeting back to earth with suggestions.
• An unquenchable desire to tinker and experiment. The entrepreneurs might land on an idea, but their first inclination would be to tear it apart, even if it was self-generated. They displayed an incessant need to test things: to find the ceiling of things, the basement of things, the surface area, the tolerance, the perimeters of ideas—theirs, yours, mine, anybody’s. They were on a mission, and the mission was discovery.
The biggest common denominator of these characteristics? A willingness to explore. The biggest enemy was the non-exploration-oriented system in which the innovators often found themselves. Hal Gregersen, one of the lead authors of the study, said in Harvard Business Review: “You can summarize all of the skills we’ve noted in one word: ‘inquisitiveness.’ I spent 20 years studying great global leaders, and that was the big common denominator.” He then went on to talk about children: “If you look at 4-year-olds, they are constantly asking questions. But by the time they are 6½ years old, they stop asking questions because they quickly learn that teachers value the right answers more than provocative questions. High-school students rarely show inquisitiveness. And by the time they’re grown up and are in corporate settings, they have already had the curiosity drummed out of them. Eighty percent of executives spend less than 20 percent of their time on discovering new ideas.”
That’s a heartbreaker. Why we’ve designed our schools and workplaces this way has never made sense to me. But you, as a parent, can encourage your child’s natural desire to explore—starting with understanding how inquisitiveness contributes to your child’s intellectual success.
2. Self-control
A healthy, well-adjusted preschooler sits down at a table in front of two giant, freshly baked chocolate-chip cookies. It’s not a kitchen table—it’s Walter Mischel’s Stanford lab during the late 1960s. The smell is heavenly. “You see these cookies?” Mischel says. “You can eat just one of them right now if you want, but if you wait, you can eat both. I have to go away for five minutes. If I return and you have not eaten anything, I will let you have both cookies. If you eat one while I’m gone, the bargain is off and you don’t get the second one. Do we have a deal?” The child nods. The researcher leaves.
What does the child do? Mischel has the most charming, funny films of children’s reactions. They squirm in their seats. They turn their backs to the cookies (or marshmallows or other assorted caloric confections, depending on the day). They sit on their hands. They close one eye, then both, then sneak a peek. They are trying to get both cookies, but the going is tough. If the children are kindergartners, 72 percent cave in and gobble up the cookie. If they’re in fourth grade, however, only 49 percent yield to the temptation. By sixth grade, the number is 38 percent, about half the rate of the kindergartners.
Welcome to the interesting world of impulse control. It is part of a suite of behaviors under the collective term “executive function.” Executive function controls planning, foresight, problem solving, and goal setting. It engages many parts of the brain, including a short-term form of memory called working memory. Mischel and his many colleagues discovered that a child’s executive function is a critical component of intellectual prowess.
Executive function is a better predictor of academic success than IQ.
We now know that it is actually a better predictor of academic success than IQ. It’s not a small difference, either: Mischel found that children who could delay gratification for 15 minutes scored 210 points higher on their SATs than children who lasted one minute.
Why? Executive function relies on a child’s ability to filter out distracting (in this case, tempting) thoughts, which is critical in environments that are oversaturated with sensory stimuli and myriad on-demand choices. That’s our world, as you have undoubtedly noticed, and it will be your children’s, too. Once the brain has chosen relevant stimuli from a noisy pile of irrelevant choices, executive function allows the brain to stay on task and say no to unproductive distractions.
At the neurobiological level, self-control comes from “common value signals” (measures of neural activity) generated by a specific area of the brain behind your forehead. It is called—brain jargon alert—the ventromedial prefrontal cortex. Another area of the brain, the dorsolateral prefrontal cortex, throws out bolts of electricity to this ventromedial cousin. The more practice a child has in delaying gratification, the better aimed the jolt becomes, and the more control it can exert over behavior. Researchers originally discovered this while having diet-conscious adults look at pictures of carrots, then switching the picture to candy bars. Their brains exerted powerful “I-don’t-care-if-it’s-sugar-you-can’t-have-any” signals when the chocolate appeared.
A child’s brain can be trained to enhance self-control and other aspects of executive function. But genes are undoubtedly involved. There seems to be an innate schedule of development, which explains why the cookie experiment shows a difference in scores between kindergartners and sixth graders. Some kids display the behaviors earlier, some later. Some struggle with it their entire lives. It’s one more way every brain is wired differently. But children who are able to filter out distractions, the data show, do far better in school.
3. Creativity
My mother’s favorite artist in the world was Rembrandt. She was enraptured by his use of light and space, which transported her effortlessly into his 17th-century world. She was much less enamored of 20th-century art. I remember her railing about Marcel Duchamp’s Fountain—simply a urinal—being placed in the same artistic firmament as her beloved van Rijn. Toilets as art? And she hated it? For me as an 11-year-old boy, that was artistic Valhalla!
Mom, to whom I owe every atom of curiosity I possess, reacted with her typical parental insight and grace: She set aside her own preferences and followed my curiosity. She brought home two pictures wrapped in brown paper and sat me down. “Imagine,” she began, with just a hint of eye rolling, “that you tried to express in two dimensions all the information of a three-dimensional object. How would you do it?” I stumbled around trying to get the right answer, or any answer, but made no progress. Mom interrupted. “Perhaps you would come up with something like this!” With the flourish of an actress, which she briefly was, Mom ripped off the wrapping, revealing prints of Picasso masterpieces: Three Musicians and Violin and Guitar. It was love at first cube. Not to take anything away from Rembrandt, but Three Musicians was a revelation to me, as was the creative mind that conceived it.
Why did I think that? How does anyone recognize creativity? It is a tough question, saturated in cultural subjectivity and individual experience, as the differences between me and my mother showed. Researchers do believe that creativity has a few core components, however. These include the ability to perceive new relationships between old things, to conjure up ideas or things or whatever that do not currently exist. (Attempts to depict 3-D in a 2-D world come to mind.) Creativity also must evoke emotions, positive or negative, in someone else. Something—a product, a result—has to come of the process. And it involves a healthy dose of risk taking. It took a lot of guts to make a painting of musicians that looked as if they had exploded. It took a lot of guts to plop down a urinal in a 1917 New York show and call it art.
Human creativity involves many groups of cognitive gadgets, including episodic memory and autobiographical memory systems. Like a TiVo recording a sitcom, these systems permit the brain to keep track of events happening to you, allowing you a reference to your personal experiences in both time and space. You can recall going to the grocery store and what you bought there, not to mention the idiot who clipped your heel with a grocery cart, because of these episodic memory systems. They are separate from the memory systems that allow you to calculate the sales tax on your purchase, or even remember what a sales tax is. But that’s not all episodic systems do.
Scientist Nancy Andreasen found that these TiVos are recruited when innovative people start associating connectively—making the insightful connections across seemingly unrelated notions that allow them to create. The TiVos reside in brain regions called association cortices, which are huge in humans—the biggest of any primate—stretching out like cobwebs across the frontal, parietal, and temporal lobes.
A second set of findings associates creativity with risk taking. This is not the kind of foolishness where you as an undergraduate ate two 16-inch pizzas in one sitting because someone named Tom-Tom dared you (don’t ask). Abnormal risk taking, which is also associated more with substance abuse and bipolar mania, does not make you more creative. There is a type of risk taking that does, however, and the research community calls it “functional impulsivity.” Researchers uncovered two separate neural processing systems that manage functional impulsivity. One governs low-risk, or “cold,” decision-making behaviors; the other governs high-risk, or “hot,” decision-making behaviors. A cold decision might involve a child agreeing to go to a favorite restaurant with a friend. A hot decision might involve ordering the nuclear inferno chili appetizer on the friend’s dare.
With all the crazy things children do, how can we tell functional impulsivity from abnormal risk taking? Unfortunately, there is no test that can distinguish “productive” from “stupid” in kids (or adults, for that matter).
Research on risk shows some sex-based differences. Boys are less cautious, for example. The difference starts showing up in the second year of life, and then things really ramp up: Boys are 73 percent more likely than girls to die from accidents between birth and puberty, and they break rules more often. But in recent decades, the sex-based differences have begun to shrink, perhaps because of changing expectations. Separating nature from nurture is darned hard with issues like these.
Whatever their gender, creative entrepreneurs have functional-impulsivity instincts in spades. They score atmospherically high on tests that measure risk taking, and they have a strong ability to cope with ambiguity. When their brains are caught in the act of being creative, the medial and orbital sectors of the prefrontal cortex, regions just behind the eyes, light up like crazy on an fMRI. More “managerial types” (that’s actually what researchers call them) don’t have these scores—or these neural activities.
Can you predict creativity in kids? Psychologist Paul Torrance created a 90-minute exam called the Torrance Tests of Creative Thinking. The tests are composed of some truly delightful problems. Children might be presented with a picture of a stuffed rabbit, then told they have three minutes to improve upon the design to make it more fun to play with. They might be presented with a scribble, then told to make a narrative from it. Torrance first administered the exam in 1958 to several hundred children, then followed their lives into adulthood, assessing their creative output throughout: things like patents filed, books written, papers published, grants awarded, and businesses started. The study is still ongoing, the participants christened “Torrance’s Kids.” Torrance died in 2003, and the study is now supervised by colleagues.
As a research tool, the exam has been formally evaluated many times. Though the test is not without its critics, the most amazing finding remains how well a child’s scores predict his or her future creative output. Indeed, the scores predict lifetime creative output with a correlation three times stronger than that of native IQ. The test has been translated into 50 languages and taken by millions of people. It is the go-to standard for evaluating creativity in children.
4. Verbal communication
The most memorable experience in my rookie year of parenting our younger son, Noah, was the moment he said his first multisyllable word. Noah’s first six months had been a fountain of joy for our family. He is a glass-half-full kind of kid, with a smile as effervescent as root beer and a laugh like bubble bath, and Noah approached his language skills with the same joy. He possessed a particular preoccupation with sea creatures, which I blame in equal parts on Finding Nemo and National Geographic. We put pictures of sea animals on the ceiling above his changing table, including a cartoon of a giant red Pacific octopus. He had not yet said any full words at the half-year mark, but he was about to.
One morning I was busy changing his diaper, just before work. Noah suddenly stopped smiling and just stared straight at the ceiling as I cleaned him up. Slowly, deliberately, he pointed his finger upward, turned his gaze from the ceiling, looked me straight in the eye, and said in a clear voice: “Oct-o-pus.” Then he laughed out loud. He pointed at it again, said it louder: “OCT-O-PUS,” and giggled. I almost had a heart attack. “Yes!” I cried, “OCTOPUS!” He replied, “Octo, octo, octopus,” laughing now. We both chanted it. I forgot what I was doing the rest of that morning—I think I called in sick for work—and we had a dance that day, celebrating all things eight-legged. Other words came in rapid succession in the days following. (So did my absenteeism.)
You can’t argue with the fact that verbal skills are important in human intelligence. They even make it into IQ tests. One of the seminal joys for any parent is to watch a child grapple with this unique human talent in the first months of life. What happened in Noah’s brain that made so many things come together at once on that changing table—or in any other child’s brain as language dawns on her like a sunrise? We don’t really know. Theories abound about how we acquire language. Famed linguist Noam Chomsky believes we are born with language software preloaded into our heads, a package he calls universal grammar.
Once language acquisition gets going, it tends to develop quickly. Within a year and a half, most kids can pronounce 50 words and understand about 100 more. That figure explodes to 1,000 words by a child’s third birthday and to 6,000 just before the sixth birthday. Calculated from birth, we acquire new words at the rate of three per day. This project takes a long time to finish. English will require the mastery of about 50,000 words, and that doesn’t even include idioms and fixed expressions like “hitting a home run” or “pot of gold.” It’s pretty complex stuff. On top of vocabulary, children have to learn the sounds of the language (phonemes) and the social meaning of the words (affective intent).
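(The rate is easy to verify: 6,000 words acquired over the roughly 2,190 days between birth and a sixth birthday comes out to almost three new words every single day.)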
Infants track these characteristics of language at an astonishingly early age. At birth, your baby can distinguish between the sounds of every language that has ever been invented. Professor Patricia Kuhl, co-director of the Institute for Learning and Brain Sciences at the University of Washington, discovered this phenomenon. She calls kids at this age “citizens of the world.” Chomsky puts it this way: We are not born with the capacity to speak a specific language. We are born with the capacity to speak any language.
Foreign languages
That doesn’t last. By their first birthday, Kuhl found, babies can no longer distinguish between the sounds of every language on the planet. They can distinguish only between those to which they have been exposed in the past six months. A Japanese baby not exposed to “rake” and “lake” during her second six months of life cannot distinguish between those two sounds by the time she is 1 year old. As always, there are exceptions. Adults with training can still learn to distinguish speech sounds in other languages. But in general, the brain appears to have a limited window of opportunity in an astonishingly early time frame. The cognitive door begins swinging shut at 6 months old, and then, unless something pushes against it, the door closes. By 12 months, your baby’s brain has made decisions that affect her the rest of her life.
What is strong enough, Kuhl and other researchers wondered, to keep that door from closing? Say you expose your baby, in the critical period, to a tape of someone speaking a foreign language. Does the brain stay open for phonemic business? The answer is no. How about a DVD of someone speaking the foreign language? The door continues to shut. Only one thing keeps that door open to another language. You have to deliver the words through a social interaction. A real live person has to come into the room and speak the language directly to the child. If the child’s brain detects this social interaction, its neurons will begin recording the second language, phonemes and all. To perform these cognitive tasks, the brain needs the information-rich, give-and-take stimulation that another human being provides.
Tucked into these data is a bombshell of an idea, one with empirical support across the developmental sciences. Human learning in its most native state is primarily a relational exercise. Intelligence is not developed in the electronic crucibles of cold, lifeless machines but in the arms of warm, loving people. You can literally rewire a child’s brain through exposure to relationships.
Intelligence is not developed in the crucibles of machines but in the arms of warm, loving people.
Hear that laughter? That’s the sound of my son Noah, demonstrating to his old man how important honest-to-God active parenting is in teaching him how to do something as wonderful, and as human, as learning languages.
5. Interpreting nonverbal communication
Though speech is a uniquely human trait, it is nestled inside a vast world of communication behaviors, many of which are used by other animals, too. But we aren’t always communicating the same thing, as legendary dog whisperer Cesar Millan points out.
If you’ve ever seen National Geographic’s Dog Whisperer, you know Millan is a world-champion dog handler. His secret is that he thinks like a dog, not like a person, when he’s interacting with a dog. Millan told Men’s Health, “A lot of people who meet a new dog want to go over to him, touch him, and talk to him.” That is, of course, the custom when people meet a new person. But, Millan says, “in the language of dogs, this is very aggressive and confusing.” Instead, Millan says, when you meet a new dog, ignore the animal like an aloof, jilted lover. Don’t make eye contact. Let the dog come over and inspect you, sniff you. Once the dog gives you cues that he doesn’t find you a threat—like backing away or rubbing against you—then you can talk, touch, or make eye contact. When dogs attack people, they may in some cases simply be acting upon an ancient behavioral reflex involving a reaction to, of all things, somebody’s face.
Face-to-face communication in the animal world has many meanings, most of them not very nice. Extracting social information by examining the face is a powerful slice of mammalian evolutionary history. But we humans use our faces, including eye-to-eye contact, for many reasons besides communicating threats. We have the most sophisticated nonverbal message systems on the planet. From babies on up, we constantly communicate social information with our bodies in coordination with our smiles and frowns. Together they constitute the crown jewels of extrospective information—remember that term?—which is a potent way to get a point across quite quickly.
Though much mythology surrounds the meaning of body language (sometimes people cross and uncross their legs simply because their legs get tired), real findings have emerged from the study of it, some relevant to parenting. Two of the more intriguing studies involve how body language and gestures interact with human speech.
Learning sign language may boost cognition by 50 percent
Gestures and speech used similar neural circuits as they developed in our evolutionary history. University of Chicago psycholinguist David McNeill was the first to suggest this. He thought nonverbal and verbal skills might retain their strong ties even though they’ve diverged into separate behavioral spheres. He was right. Studies confirmed it with a puzzling finding: People who could no longer move their limbs after a brain injury also increasingly lost their ability to communicate verbally. Studies of babies showed the same direct association. We now know that infants do not gain a more sophisticated vocabulary until their fine-motor finger control improves. That’s a remarkable finding. Gestures are “windows into thought processes,” McNeill says.
Could learning physical gestures improve other cognitive skills? One study hints that it could, though more work needs to be done. Kids with normal hearing took an American Sign Language class for nine months, in the first grade, then were administered a series of cognitive tests. Their attentional focus, spatial abilities, memory, and visual discrimination scores improved dramatically—by as much as 50 percent—compared with controls who had no formal instruction.
Babies need face time
An important subset of gestures, you might guess, are facial expressions. Babies love to gaze at human faces. Mom’s is best of all—but they prefer any human face over any monkey face, llama face, cat face, or dog face. What are they looking for in your face? Emotional information. Are you happy, sad, threatened?
We all spend a great deal of time reading faces. A person’s nonverbal communication can confirm his or her verbal communication, undermine it, or even contradict it. Our relationships depend on our ability to interpret it. So humans read faces reflexively, and you can observe this even in an infant’s earliest hours. The skill develops over time, with the most sophisticated behavior observable about five to seven months after birth. Some people are born better at it than others. But we sometimes get it wrong. Researchers call this kind of misreading Othello’s Error.
In Shakespeare’s tragic play, the Moor Othello believes his wife is fooling around on him. Othello is enraged as he confronts her in their bedroom. She is naturally scared out of her mind. Seeing her panicked face, he interprets this fear as guilt, all the evidence of infidelity that he needs. Before he smothers her in bed, out come these famous love-hate words:
Ay, let her rot, and perish, and be damned to-night;
for she shall not live: no, my heart is turned to
stone; I strike it, and it hurts my hand. O, the
world hath not a sweeter creature: she might lie by
an emperor’s side and command him tasks.
Competently decoding another person’s face can take years of experience; like Othello, adults sometimes make mistakes. The only way to improve this accuracy is by interacting with other people. That’s why babies need human time in their earliest years. Not computer time. Not television time. Your baby’s brain needs interaction with you, in person, on a consistent basis.
Either that, or training by psychologist Paul Ekman.
What’s in a face
Paul Ekman, professor emeritus from the University of California–San Francisco, seldom misinterprets people’s faces. He has cataloged more than 10,000 possible combinations of facial expressions, creating an inventory called the Facial Action Coding System (FACS). This research instrument allows a trained observer to dissect an expression in terms of all the muscular actions that produced it.
Using this tool, Ekman has found several surprising things about human facial recognition. First, people all over the world express similar emotions using similar facial muscles. These universal basic emotions are happiness, sadness, surprise, disgust, anger, and fear. (The finding was originally quite startling; research at the time chalked up facial expressions mostly to cultural mores.) Second, the conscious control we can exert over our facial features is limited, which means we give away a lot of free information. The muscles that surround our eyes, for example, are not under conscious management. This may be why we tend to believe what the eyes tell us.
One of Ekman’s research videotapes shows an interaction between a psychiatrist and “Jane,” his very troubled patient. Jane had been suffering from such severe depression that she was hospitalized and under suicide watch. When she seemed to show real signs of improvement, she asked her physician to let her go home for the weekend. The camera is on Jane’s face, full view, when the doctor asks about her plans for the future. As Ekman slows the tape, examining it frame by frame, a sudden flash of deep desperation arcs across Jane’s face. She doesn’t seem to be able to control it. It turns out Jane was planning to kill herself when she got home, which fortunately she admitted before she was discharged. Ekman uses the tape to train police officers and mental-health professionals. He stops the tape and asks the students if they can see the flash of desperation, only a twelfth of a second long. Once they know what to look for, they can.
These flashes are called micro-expressions, facial gestures that last a fraction of a second but tend to reveal our truest feelings in response to rapid-fire questioning. Ekman found that some people could detect and interpret these micro-expressions better than others. People lie a lot, and those who could pick up these micro-expressions were terrific at detecting falsehoods. Ekman found that he could train people to read these micro-expressions, improving their ability to pick up nonverbal cues.
How can we tell that face-reading abilities are so important? In part because the brain devotes a tremendous amount of neural real estate, including an important region called the fusiform gyrus, to the single task of processing faces. Neural acreage of this size is expensive; the brain doesn’t fence off one area for such a restricted function unless it has a darn good reason.
We know the brain has face-specific regions because a person can damage them and lose the ability to recognize the people to whom the faces belong. The disorder is called prosopagnosia, or face blindness. Parents of face-blind kids have to provide them with instructions like “Remember, Drew is the one with the orange T-shirt; Madison is wearing a red dress.” Otherwise, they lose track of the kids they’re playing with. Nothing is wrong with the children’s eyes, just their brains.
Team player
Being able to correctly interpret gestures and facial expressions would have been highly prized in the merciless Serengeti. That’s because social coordination is a great survival skill, useful whether you are hunting animals bigger than you or just trying to get along with the neighbors. Among many other gifts, social coordination allows the concept of teamwork. Most researchers believe the ability to work as a team allowed us to pole-vault over our otherwise debilitating physical wimpiness.
How does interpreting faces help with teamwork? The ability to cooperate in a high-risk setting requires an intimate, moment-by-moment knowledge of another’s intentions and motivations. Knowing the forward progress of somebody’s psychological interior allows a more accurate prediction of his or her behavior (just ask any quarterback in the NFL). Reading the emotional information in someone’s face is one of the quickest ways to get these insights. And those who could do it accurately functioned better on Team Serengeti. Today, difficulty reading the emotional information embedded in faces is a hallmark of autism. Teamwork is tough for these kids.
Innovators are nonverbal experts
Could your child’s ability to read faces and gestures predict her success in our 21st-century workforce? The investigators who studied successful entrepreneurs think so. We’ve already explored three of the five characteristics in the Innovator’s DNA study. The other two are incredibly social in origin:
• They were great at a specific kind of networking. Successful entrepreneurs were attracted to smart people whose educational backgrounds were very different from their own. This allowed them to acquire knowledge about things they would not otherwise learn. From a social perspective, this behavioral pirouette is not easy to execute. How did they manage to do it consistently? Using insights generated by the final common trait.
• They closely observed the details of other people’s behaviors. The entrepreneurs were natural experts in the art of interpreting extrospective cues: gestures and facial expressions. Consistently and accurately interpreting these nonverbal signals is probably how they were able to extract information from sources whose academic resources were so different from their own.
Want your baby to grow up to be a successful innovator? Make sure she has nonverbal skills down cold—and an inquisitiveness to match.
Not on IQ tests
From exploration, self-control, and creativity to verbal and nonverbal ability, it is clear that the intelligence stew has many ingredients. Standard IQ tests are not capable of measuring most of these elements, even though they play a powerful role in the future success of your children. Given their uniqueness, that’s not surprising. Some are so unexpected as to defy belief (your kid’s chances of being a great entrepreneur are linked to her ability to decode faces?). So you need not be discouraged if your kid isn’t in the 97th percentile on certain tests. She may have many other intellectual aspects in abundance that IQ tests are inherently incapable of detecting.
That’s not to say that everyone is a potential Einstein. These gifts are unevenly sprinkled among our children, and most have genetic components. Your autistic child may never have the warmth of a pastor, for example, no matter how hard you try. But as you know, there’s more to intelligence than just seeds.
Time to get our hands a little dirty, tilling some truly remarkable findings about the soil that makes our kids as smart as their seeds will allow them to be.
Key points
• There are aspects of your child’s intelligence about which you can do nothing; the genetic contribution is about 50 percent.
• IQ is related to several important childhood outcomes, but it is only one measure of intellectual ability.
• Intelligence has many ingredients, including self-control, creativity, communication skills, and a desire to explore.