CHAPTER 7

INTUITION VERSUS RATIONALITY, AND HOW TO BECOME REALLY GOOD AT WHAT YOU DO

Intuition will tell the thinking mind where to look next.

—JONAS SALK, DISCOVERER OF THE POLIO VACCINE

Have you ever bought a car? Have you ever been on a date? If your answer is yes, then you may agree that often people should behave very differently in those two situations. Allow me to explain.

When you buy a car, you probably look up reviews of different models, come up with a list of desiderata in accordance with your specific needs, visit different dealerships, try out different models in person, and then sit down and ponder all this information, perhaps creating a table to help you compare your top choices. Then you make a choice and buy the car. At least, this is how most people think they ought to behave when purchasing a vehicle, even though a good number of us are likely to end up following the “gut feeling” we happen to have in response to a particular model, often for no reason other than that we look or feel good in it (and the color is definitely the right one). My point is this: we think we should act rationally when purchasing a car because it is an important investment of money and it is important for our safety. The fact that more often than not we end up following our intuition instead of our reason is just an unfortunate part—in this case—of being human.

Compare that with being on a date with someone you may get involved with. Few people would argue that the best approach is to read the other person’s reviews (which in these days of electronic dating we can actually do!) or come up with lists of qualities we “must have” or “need to avoid.” This is about love, right? It’s not particularly romantic to invite someone to dinner and then show up with roses in one hand and a checklist and pencil (or an iPhone app) in the other. This is a situation where you are supposed to trust your intuition, because it’s about love and commitment, not something as mundane as buying a car. Then again, one could look at the facts and argue that the decision to engage in a relationship is a hell of a lot more important—and carries a lot more consequences—than buying a car. Not to mention those oft-cited pesky statistics about the high divorce rates in modern society. Shouldn’t we in fact try to keep our hormones and intuitions on hold and whip out our checklist instead?

These two examples illustrate our ambivalent attitude toward rational thinking (the checklist and library research approach) and intuition (the gut feeling, though it obviously has nothing to do with our digestive system). Indeed, we keep hearing a lot of nonsense about these two modes of thinking and about their relationship to each other. Some people think of themselves as intuitive across the board, a notion that we will see is unfounded. Others—most famously the ancient philosophers Plato and Aristotle—think that it’s necessary for reason to override emotive reactions (which are often confused with intuitions). Still others—Enlightenment philosopher David Hume among them—believe that reason is just a means to an end, and that the real drive of human action is always emotional.

In this chapter, we are going to take a look at intuition and how it relates to conscious cognitive thinking, because these are the two channels through which our minds make sense of the world. As we shall see, intuition and rational cognition can work in concert to allow us to do all sorts of interesting things—from choosing the right car to falling in love with the right person—and even to excel at whatever it is we wish to do, from playing chess to learning a musical instrument to simply being as good as we can be at our daily job.

The word intuition comes from the Latin intueri, to look at or contemplate, a fitting root for a kind of knowledge that comes from within. Until recently, intuition, like consciousness (see Chapter 10), was the sort of thing that self-respecting scientists steered clear of, on pain of being accused of engaging in New Age woo-woo rather than serious science. Heck, even most philosophers—who historically had been very happy to talk about consciousness, far ahead of the rise of neurobiology—found themselves with not much to say about intuition. However, these days cognitive scientists think of intuition as a set of nonconscious cognitive and affective processes; the outcome of these processes is often difficult to articulate and is not based on deliberate thinking, but it’s real and (sometimes) effective nonetheless. It was William James, the father of modern psychology, who first proposed the idea that cognition takes place in two different modes, and his insight anticipated modern so-called dual-process theories of cognition. Intuition works in an associative manner: it feels effortless (even though it does use a significant amount of brain power), and it’s fast. Rational thinking, by contrast, is analytical, requires effort, and is slow. Why, then, would we ever want to use a system that makes us work hard and doesn’t deliver rapid results? Think of it this way: intuitions, contrary to much popular lore, are not infallible. Cognitive scientists treat them as quick first assessments of a given situation, as provisional hypotheses in need of further checking. Sometimes you have to make a decision fast and on the basis of relatively little information—so intuition is the only game in town. But if you can afford to deliberate on the issue and collect further data, then it will pay off to get your overt thinking and your unconscious thinking working together on the problem at hand.

One of the first things that modern research on intuition has clearly shown is that there is no such thing as an intuitive person tout court. Intuition is a domain-specific ability, so that people can be very intuitive about one thing (say, medical practice, or chess playing) and just as clueless as the average person about pretty much everything else. Moreover, intuitions get better with practice—especially with a lot of practice—because at bottom intuition is about the brain’s ability to pick up on certain recurring patterns; the more we are exposed to a particular domain of activity, the more familiar we become with the relevant patterns (medical charts, positions of chess pieces), and the more readily and quickly our brains generate heuristic solutions to the problem we happen to be facing within that domain.

And of course, just as with everything else that we think or feel, we can now pinpoint which areas of the brain are most involved with intuition (as opposed to overt cognition). The list includes the amygdala, the basal ganglia, the nucleus accumbens, the lateral temporal cortex, and the ventromedial prefrontal cortex. The inclusion of the amygdala is particularly revealing, since, as we’ve seen, that’s the part of the human brain that is most associated with emotions. Because of this association, intuitions are accompanied by a strong “gut feeling” that we are right. Intuitive responses and emotional responses are not exactly the same thing neurologically speaking, but they share some of the same brain circuitry and are therefore difficult to disentangle. That can be a problem when we end up feeling sure of an intuition that turns out to be wrong—we buy a lemon car, or we marry the wrong person.

The deep connection between emotions and intuition was evident in a study conducted by Christian Jordan and his colleagues at Wilfrid Laurier University. They were interested in the relationship between trust in intuition and implicit self-esteem. People can have implicit or explicit self-esteem, the former being predictive of nonverbal indicators of anxiety (skin conductance, for instance), and the latter being correlated with conscious reports of anxiety (that is, when one is aware of being anxious). Interestingly, implicit and explicit self-esteem are often uncorrelated, but Jordan and his collaborators discovered that if people trust their intuitions, their implicit self-esteem increases and implicit and explicit self-esteem become positively correlated. Conversely, if people distrust their intuitions, their implicit self-esteem goes down and the relationship between implicit and explicit self-esteem breaks down or even becomes negative.

This relationship between trust in one’s intuition and implicit self-esteem is not just correlational: Jordan and his colleagues were able to experimentally manipulate their subjects’ trust in intuition simply by telling half the subjects, “There is clear evidence that people who adopt an intuitive approach to decision-making are more successful in many areas of their lives,” while telling the other half, “People who adopt a rational approach to decision-making are more successful.” Astonishingly, this simple priming worked: when participants’ trust in intuition was measured, those who were told that intuition is a better guide for decision-making scored significantly higher in their trust in intuition than those who were told that rational decision-making is superior. Crucially, the subjects who had been manipulated to have more trust in intuition also showed higher implicit self-esteem, with implicit and explicit self-esteem becoming positively correlated.

Why should this be? Jordan and his collaborators speculate that people experience implicit self-esteem as a particular form of intuition, so it follows that if they are inclined to trust intuition in general their implicit self-esteem will go up, while if they are not inclined to trust their intuition their implicit self-esteem will go down. With these findings, we can begin to appreciate how complex and subtle human behavior is, and why people are so emotionally attached to their intuitive abilities—regardless of how good their intuitions about particular problems actually are.

So intuitions are good for a number of reasons: they are effortless, they often provide us with efficient shortcuts as we tackle complex problems, and they can even alter our self-esteem through their connection with our affective responses. Still, remember the idea that intuitions should be treated as provisional hypotheses to be tested in the light of conscious reason, if at all possible (and if the effort is worth it)? That idea leads us to ask: can people learn to accord limited trust to their intuitions and to move on to full cognitive engagement when the situation requires it? Research by Adam Alter and his collaborators at Princeton, the University of Chicago, and Harvard offers intriguing clues to the answer.

There are some standard situations in which people do switch from intuitive to explicit analyses of a problem, usually (though certainly not always) when they have something personal at stake in the outcome or when they know that they will be called to account for their decisions. However, people who are under time pressure or who are experiencing “cognitive load” (that is, they are simultaneously engaged in other tasks that deplete their brain resources) will rely on intuition and be less likely to correct errors that may arise from it. As it turns out, our brain even comes equipped with mechanisms that tell us (subconsciously) when it is or isn’t time to trust our intuition—a metaintuition, if you will.

Alter and his colleagues have investigated how the use of intuition is affected by what they call “disfluency,” a measure of how uncomfortable we feel with the information we are receiving. It turns out that the more disfluent we are about something, the less we rely on intuition and the more we engage in full-bore analytical reasoning. Neurologically speaking, disfluency triggers the anterior cingulate cortex, which in turn activates the prefrontal cortex, where much of our analytical thinking takes place.

What causes disfluency? A number of factors are involved, some of them very simple and conveniently easy to manipulate experimentally. For instance, Alter and his group simply provided information about a particular problem or situation to their subjects in one of two forms: written in a clearly legible font or in a font that was a bit more difficult to read. (Follow-up experiments eliminated the possibility that the crucial difference between the two treatments was a simple slowing down caused by the less legible font.) Subjects who received the disfluent write-up provided more accurate responses to the test as a direct result of their more systematic (less intuitive) processing of the information. It seems that the brain really needs things to get difficult before it can be bothered to engage its more sophisticated, effortful, and time-consuming thinking apparatus!

The experiments conducted by Alter’s group even led to the discovery that self-induced manipulations can do the trick. Subjects were asked to furrow their brows, in an effort to simulate the expression of someone engaged in deep thinking, while a control group was asked to puff out their cheeks, an expression presumably unrelated to any particular type of thinking. The people who furrowed their brows turned out to be less confident in their intuitions; as a result, they engaged in more effortful thinking, which led to better results on their tests. The next time you want to trick your brain into behaving intellectually, just imitate some of the stereotyped postures of an intellectual and your prefrontal cortex will get the message.

Achieving the most productive balance between intuition and analytical thinking is obviously crucial to making the best decisions we can in our everyday lives, but it turns out that this balance is also important for businesses (and governments). Accordingly, some authors have begun to pay attention to how managers in the business world achieve it. Until recently, the prevalent business culture put an emphasis on deliberation and on the explicit use of all the available information to achieve the best possible results. But as Marta Sinclair and Neal Ashkanasy point out in a paper they coauthored on this subject, analytical decision-making in business has a success rate of about 50 percent. Although one could very well argue that a 50 percent success rate when dealing with complex problems characterized by dozens of variables is actually not bad at all, Sinclair and Ashkanasy and other researchers began to investigate whether an increased emphasis on intuition in business decisions has an effect on that rate. Their conclusion? That the best approach to business decisions is an integrative model in which domain-specific (that is, expertise-based) intuition and rational thinking are used in concert, just as cognitive scientists have suggested we do in other areas of life. It seems that the old counterposition between intuition and analysis is finally giving way to a more—shall we say?—reasonable understanding of the human mind and how to get the most out of it.

In the discussion so far you may have noticed that I have made no mention of gender or culture, and you may be eager to ask: Isn’t it well known that women are more intuitive than men? And that Asian cultures emphasize more holistic thinking as opposed to the Western analytical approach? It turns out that there is good evidence for the latter generalization (though we do not know exactly why), and very little, if any, support for the former. The first hint that we should be suspicious of the idea that women are more intuitive than men comes from our earlier realization that there simply isn’t any such thing as general-purpose intuition. We can be very intuitive about X and utterly unintuitive about anything other than X. And sure enough, neither the research conducted by Jordan and his collaborators nor the study by Sinclair and Ashkanasy turned up much evidence of gender-specific differences in intuitive abilities. Indeed, the whole—perennially popular—cottage industry purveying “men are from Mars and women are from Venus” nonsense has been thoroughly and convincingly debunked by a number of authors, including Cordelia Fine (in her Delusions of Gender) and Rebecca M. Jordan-Young (author of Brain Storm: The Flaws in the Science of Sex Differences), though I’m sure the myth will persist, as myths are prone to do.

What about intercultural differences? Those are easier to verify and quantify, though we can only speculate about why they are present. For instance, research conducted by Emma Buchtel and Ara Norenzayan showed that Korean college students consistently rate intuition as more important than logic, while American students rank the two approaches in reverse. (A closer look at the details reveals that the Koreans’ preference for intuition was statistically significant, while the Americans’ preference for logic was not.) The same researchers also compared Canadians of European origin with Canadians of East Asian origin: they found that both groups think of intuitive people as more social, but East Asians (not Europeans) also think of them as wiser and more reasonable.

This kind of study is, of course, entirely descriptive. It is neither prescriptive (it doesn’t tell us which approach is more effective) nor causally explanatory (it doesn’t tell us why the intercultural differences are there to begin with). From what we have seen so far, it seems likely that, from a prescriptive perspective, Westerners would do well to rely a bit more on intuition, while Asians would benefit from a more systematic use of analytical thinking. And there are two major types of explanations for why different cultures seem to have these propensities, though it is hard to imagine how either one could be tested empirically. (Moreover, they are certainly not mutually exclusive and could very well reinforce each other.)

One possibility is that Asian cultures are characterized by more social interconnectivity and mutual interdependence than Western ones and that holistic-intuitive thinking simply works better in the Asian social environment, while atomistic-analytic thinking works better in the Western one. The other explanation is historical: Western thought arose out of the philosophy of ancient Greece, the birthplace of logic and analysis, while Asian thinking has been influenced by Confucian and Taoist traditions—such as the concept of wu-wei, or effortless action, which can easily be interpreted as a form of intuition. It is hard to tell whether certain historical roots gave birth to a particular type of culture, or whether certain historical traditions took hold because Aristotle and Confucius, for instance, found themselves in different environmental milieus. If the latter (which I do think is more likely), we are still left with no explanation for why some human cultures tend to be more socially interdependent while others favor individualism. The difference is highly unlikely to be genetic in nature (there are very few systematic differences in the genetic makeup of different ethnic groups), so perhaps the answer can be found in differences in the physical environment combined with historical contingencies.

Be that as it may, the different emphases that Asians and Westerners put on intuitive versus analytical approaches should be taken into consideration by both businesses and governments when dealing with each other. As Buchtel and Norenzayan note, for instance, educators from the two traditions sometimes disparage each other: some Asian educators think that Western students are dogmatic and simplistic in their approach to problems, and some Western educators think of Asian students as not sufficiently rigorous from a logical perspective. This “mutual yuck” effect, as Buchtel and Norenzayan call it, ends up affecting international trade and intergovernmental relations, as when American representatives to the World Trade Organization complained that their Chinese counterparts were neither explaining nor substantiating their decisions; to the Chinese, this was an odd complaint based on a refusal to see problems and solutions holistically. Learning about how the brain works apparently can affect not only our individual lives but also human interactions in the wider world.

There is another aspect to the question of intuition versus conscious thinking that affects our quality of life, and that has to do with research showing how people get better at what they do or get stuck in it. If you are lucky enough to have a job you really enjoy, the kind that makes you look forward to getting up on Monday mornings, you are also likely to be in the rarefied group of human beings who wish to get better at their chosen profession simply for the sake of getting better—a goal that significantly affects how meaningful you think your life is. But regardless of whether you do your job because you love it or simply because it’s a good way to make sure you can pay your bills and provide for your family, doing it better is a ticket to more satisfaction and possibly a raise or promotion. Or consider another possibility: maybe your job is just a job, and that is why you love to spend your free time playing an instrument, practicing a sport, or engaging in another activity that enriches your life. Even then, the better you are at your hobby, the more satisfaction you will gain from it. In any of these cases, you may want to know about research on expertise.

An “expert” is someone who performs at a very high level in a given field, be it medicine, law, science, chess, tennis, or soccer. As it turns out, people become experts (or simply much, much better) at what they do when they use their intuition and conscious thinking in particular ways. Research on acquiring skills shows that, roughly speaking, and pretty much independently of whether we are talking about a physical activity or an intellectual one, people tend to go through three phases as they improve their performance. During the first phase, the beginner focuses her attention simply on understanding what it is that the task requires and on not making mistakes. In phase two, such conscious attention to the basics of the task is no longer needed, and the individual performs quasi-automatically and with reasonable proficiency. Then comes the difficult part. Most people get stuck in phase two: they can do whatever it is they set out to do decently, but they stop short of the level of accomplishment that provides the self-gratification that makes one’s outlook significantly more positive, or that earns the external validation that results in raises and promotions.

Phase three often remains elusive because while the initial improvement was aided by switching control from conscious thought to intuition—as the task became automatic and faster—further improvement requires mindful attention to the areas where mistakes are still being made and intense focus to correct them. Referred to as “deliberate practice,” this phase is quite distinct from mindless or playful practice. Think of a soccer (or football or baseball) player, for instance. Just mindlessly kicking the ball back and forth or playing a no-stakes pickup game isn’t going to improve her skills, no matter how many hours a week she does it, because those activities simply reinforce the automatic, intuitive style of play she has acquired up to that point. To make it to the next level, she has to concentrate on the plays and moves that still don’t come easily or naturally for her, and in order to do that she has to identify those problem areas (probably with the help of a coach) and then focus her mind on overriding her intuitive and already ingrained way of handling those very areas. It’s hard work, and it requires a fine balance between intuitive (fast, automatic) and conscious (slow, mindful) thinking. Without it, however, the player’s skill development becomes “arrested,” stuck at an intermediate level that is likely to become increasingly frustrating and to affect not just her career (or hobby, if she is not a professional athlete) but her quality of life (if the activity constitutes an important area of her life).

Researchers have also figured out how long it takes to develop simple proficiency in a given field or activity versus actual expertise. And again, roughly speaking, the results are about the same regardless of whether we are talking about playing the violin, chess, or tennis. The good news is that simple proficiency can be achieved in a matter of weeks or months. The bad news is that expert-level proficiency requires on average ten years of practice! Chess players don’t get good enough to compete in international tournaments unless they have engaged with the game for about a decade (and it takes another decade to reach the level of chess master, for those capable of it). Moreover, the (approximate) ten-year rule applies even to particularly talented individuals, such as child prodigies.

Why should this be? There are a variety of reasons, but two are especially important: one needs to develop the ability to anticipate problems, and that ability in turn depends not just on knowledge of a given field but on structured knowledge. For instance, studies of tennis players show that the best ones don’t simply react to whatever their opponent is serving them but are capable of anticipating where the ball will go before the opponent even hits it. There is no magic to this ability, of course; it is an acquired intuitive skill made possible by the brain having seen enough similar situations to extract patterns and thus predict where the ball is most likely to go from the anticipated angle of impact on the opponent’s racquet. Similarly, what makes a chess player good is not that he has memorized a large number of specific configurations on a chessboard and is able to recall them at will; human chess players are not computers—they don’t store information that way. Rather, the chess master’s long experience has allowed him to acquire structured knowledge, a built-in understanding of the strategy of the game that makes it possible to intuit the likeliest ways out of any difficult configuration on the chessboard. Indeed, when chess masters are presented with random board configurations—that is, configurations unlikely to arise in an actual game—they cannot recall them with any particular accuracy, which shows that their memory is of structured situations, not of the simple layout of pieces on the board.

Cindy Hmelo-Silver and Merav Green Pfeffer have investigated this difference between superficial and structural knowledge in the particular case of people’s understanding of something as mundane as aquaria. They compared how four groups of people understand aquaria: children, “naive” adults (adults with no particular interest in the subject matter), and two types of experts—biologists with an interest in ecology and hobbyists who like, build, and care for aquaria. Not surprisingly, children and naive adults displayed a much simpler understanding of the workings of an aquarium, often resorting to one type of causal explanation and failing to appreciate the intricacies of the system. Experts, on the other hand, were appreciative of the systemic functioning of an aquarium and could describe multiple causal pathways affecting the enclosed ecosystem. But here is the interesting part of Hmelo-Silver and Green Pfeffer’s findings: the two groups of experts also differed dramatically in the kind of knowledge of aquaria they had built. Regarding aquaria as microcosms of natural ecosystems, biologists explained them at an abstract-theoretical level in terms of the science of ecosystems. Hobbyists, on the other hand, built their mental model around the practical issues associated with filtering systems, feeding systems, and generally anything that played a direct role in keeping the aquarium looking good and the fish healthy. Not only is there a difference between naive and expert knowledge, but there is more than one way to acquire expert knowledge, guided not just by the intrinsic properties of the system but also by the particular kinds of interest that different individuals have in that system.

But, one might object, all this talk of improving one’s skills, of becoming chess masters and expert soccer players, surely neglects the idea of talent. Some people are just innately good at certain things, and others will never get to Carnegie Hall, no matter how much they practice. Perhaps, but solid evidence for the idea of innate talent is surprisingly hard to find, as explained in detail in David Shenk’s The Genius in All of Us: Why Everything You’ve Been Told About Genetics, Talent, and IQ Is Wrong. As Philip Ross explained in an article in Scientific American, people who talk about talent very often confuse innate ability with precocity. The two are certainly not the same thing: some children may display an early aptitude for, say, music, but from then on it is practice, practice, practice that turns them into actual prodigies (and gets them to Carnegie Hall). Having studied nature-nurture issues for my entire career as a scientist, I am certainly not about to downplay the importance of one’s genetic makeup and its effect on all aspects of our lives. But it is unfortunate that in many people’s minds the step from acknowledging genetic differences to believing in some entirely unsubstantiated and downright pernicious form of genetic determinism seems to be extremely short. Indeed, an oversimplified belief in innateness probably accounts for a substantial degree of human unhappiness, including persistent “scientific” attempts to show that particular ethnic groups (or women) are innately inferior at X whenever X happens to be something that furthers the interests of members of another ethnic group (or of males).

There is one more issue that we need to be aware of when it comes to expertise, since it affects so many aspects of our lives. There are, demonstrably, fields where alleged experts are no such thing at all, and you will pay with money, time, and emotional resources if you fall for their claims. Anders Ericsson, in The Cambridge Handbook of Expertise and Expert Performance, refers to studies that show, for instance, that so-called wine experts perform only slightly better than regular wine drinkers when they cannot read the label of the wine they are tasting. Knowing this could save you hundreds of bucks at the liquor store, and of course, if more widely appreciated, could shake the foundations of the multimillion-dollar wine industry. Similarly, Philip Ross points out that the evidence shows that psychiatric therapists with PhDs don’t actually help their patients much more than those with a master’s degree, and that—more ominously considering the worldwide financial upheavals of recent years—professional stockbrokers don’t do any better than amateurs at picking winning stocks. But I’m sure you knew that already.