4

Moral algebra: Towards the science of evidence-based wisdom

We are in the stuffy State House of Pennsylvania, in the summer of 1787. It is the middle of a stifling heatwave, but the windows and doors have been locked against the prying eyes of the public, and the sweating delegates – many dressed in thick woollen suits1 – are arguing fiercely. Their aim is to write the new US Constitution – and the stakes could not be higher. Just eleven years after the American colonies declared independence from Britain, the country’s government is underfunded and nearly impotent, with serious infighting between the states. It’s clear that a new power structure is desperately needed to pull the country together.

Perhaps the thorniest issue concerns how the public will be represented in Congress. Will the representatives be chosen by popular vote, or selected by the state legislatures? Should larger states have more seats? Or should each state be given equal representation – regardless of its size? Smaller states such as Delaware fear they could be dominated by larger states such as Virginia.2

With tempers as hot as the sweltering weather, the closed State House proves to be the perfect pressure cooker, and by the end of the summer the Convention looks set to self-combust. It falls to Benjamin Franklin – Philadelphia’s own delegate – to relieve the tension.

At eighty-one, Franklin is the oldest delegate at the Convention, and the once robust and hearty man is now so frail that he is sometimes carried into the proceedings on a sedan chair. Having personally signed the Declaration of Independence, he fears that America’s reputation in the eyes of the world hinges on the Convention’s success. ‘If it does not do good it will do harm, as it will show that we have not wisdom enough among us to govern ourselves’, he had previously written to Thomas Jefferson, who was abroad at the time.3

Franklin plays the role of the pragmatic host: after the day’s debating is over, he invites the delegates to eat and drink in his garden, just a few hundred feet from the Convention, where he may encourage calmer discussion under the cooling shade of his mulberry tree. He sometimes brings out his scientific collection, including a prized two-headed snake – which he uses as a metaphor for indecision and disagreement.

In the State House itself, Franklin is often silent, and largely influences the discussions through pre-written speeches. But when he does intervene, he pleads for compromise. ‘When a broad table is to be made, and the edges of planks do not fit, the artist takes a little from both, and makes a good joint,’ he argues during one heated debate in June.4

This pragmatic ‘carpentry’ eventually presents a solution to the issue of states’ representation – the problem that is fast threatening to destroy the Convention. The idea comes from Roger Sherman and Oliver Ellsworth, two delegates from Connecticut, who propose that Congress be divided into two houses, each elected under a different system. In the Lower House, representatives would be apportioned according to population size (pleasing the larger states), while the Senate would have an equal number of delegates per state, regardless of size (pleasing the smaller states).

The ‘Great Compromise’ is at first rejected by the delegates – until Franklin becomes its champion. He refines the proposal – arguing that the House would be in charge of taxation and spending; the Senate would deal with matters of state sovereignty and executive orders – and it is finally approved in a round of voting.

On 17 September, it is time for the delegates to decide whether to put their names to the finished document. Even now, success is not assured – until Franklin closes the proceedings with a rousing speech.

‘I confess that there are several parts of this constitution which I do not at present approve, but I am not sure I shall never approve them,’ he declares.5 ‘For having lived long, I have experienced many instances of being obliged by better information, or fuller consideration, to change opinions even on important subjects, which I once thought right, but found to be otherwise. It is therefore that the older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others.’

It is only to be expected, he says, that a group of such intelligent and diverse men should bring with them their own prejudices and passions – but he ends by asking them to consider that their judgements might be wrong. ‘I cannot help expressing a wish that every member of the Convention who may still have objections to it, would with me, on this occasion, doubt a little of his own infallibility, and to make manifest our unanimity, put his name to this instrument.’

The delegates take his advice and, one by one, the majority sign the document. Relieved, Franklin looks to George Washington’s chair, with its engraving of the sun on the horizon. He has long pondered the direction of its movement. ‘But now at length I have the happiness to know that it is a rising and not a setting sun.’

Franklin’s calm, stately reasoning is a stark contrast to the biased, myopic thinking that so often comes with great intelligence and expertise. He was, according to his biographer Walter Isaacson, ‘allergic to anything smacking of dogma’. He combined this open-minded attitude with practical good sense, incisive social skills and astute emotional regulation – ‘an empirical temperament that was generally averse to sweeping passions’.6

He wasn’t always enlightened on every issue. His early views on slavery, for instance, are indefensible, although he later came to be the president of the Pennsylvania Abolition Society. But in general – and particularly in later life – he managed to navigate extraordinarily complex dilemmas with astonishing wisdom.

This same mindset had already allowed him to negotiate an alliance with France, and a peace treaty with Britain, during the War of Independence, leading him to be considered, according to one scholar, ‘the most essential and successful American diplomat of all time’.7 And at the signing of the Constitution, it allowed him to guide the delegates to the solution of an infinitely complex and seemingly intractable political disagreement.

Fortunately, psychologists are now beginning to study this kind of mindset in the new science of ‘evidence-based wisdom’. Providing a direct contrast to our previously narrow understanding of human reasoning, this research gives us a unifying theory that explains many of the difficulties we have explored so far, while also providing practical techniques to cultivate wiser thinking and escape the intelligence trap.

As we shall see, the same principles can help us think more clearly about everything from our most personal decisions to important world events; the same strategies may even lie behind the astonishing predictions of the world’s ‘super-forecasters’.

First, some definitions. In place of esoteric or spiritual concepts of wisdom, this scientific research has focused on secular definitions, drawn from philosophy, including Aristotle’s view of practical wisdom – ‘the set of skills, dispositions and policies that help us understand and deliberate about what’s good in life and helps us to choose the best means for pursuing those things over the course of the life’, according to the philosopher Valerie Tiberius. (This was, incidentally, much the same definition that Franklin used.8) Inevitably, those skills and characteristics could include elements of the ‘tacit knowledge’ we explored in Chapter 1, and various social and emotional skills, as well as encompassing the new research on rationality. ‘Now if you want to be wise it’s important to know we have biases like that and it’s important to know what policies you could enact to get past those biases,’ Tiberius said.9

Even so, it is only relatively recently that scientists have devoted themselves to the study of wisdom as its own construct.10 The first steps towards a more empirical framework came in the 1970s, with ethnographic research exploring how people experience wisdom in their everyday lives, and questionnaires examining how elements of thinking associated with wisdom – such as our ability to balance different interests – change over a lifetime. Sure enough, wise reasoning did seem to increase with age.

Robert Sternberg (who had also developed the scientific definitions of practical and creative intelligence that we explored in Chapter 1) was a prominent champion of this early work and helped to cement its credibility; the work even inspired some of the questions in his university admission tests.11

An interest in a scientifically well-defined measure of wisdom would only grow following the 2008 financial crash. ‘There was a kind of social disapprobation for “cleverness” at the expense of society,’ explains Howard Nusbaum, a neuroscientist at the University of Chicago – leading more and more people to consider how our concepts of reasoning could be extended beyond the traditional definitions of intelligence. Thanks to this wave of attention, we have seen the foundation of new institutions designed to tackle the subject head on, such as Chicago’s Center for Practical Wisdom, which opened in 2016 with Nusbaum as its head. The study of wisdom now seems to have reached a kind of tipping point, with a series of exciting recent results.

Igor Grossmann, a Ukrainian-born psychologist at the University of Waterloo, Canada, has been at the cutting edge of this new movement. His aim, he says, is to provide the same level of experimental scrutiny – including randomised controlled trials – that we have come to expect from other areas of science, like medicine. ‘You’re going to need that baseline work before you can go and convince people that “if you do this it will solve all your problems’’,’ he told me during an interview at his Toronto apartment. For this reason, he calls the discipline ‘evidence-based wisdom’ – in the same way that we now discuss ‘evidence-based medicine’.

Grossmann’s first task was to establish a test of wise reasoning, and to demonstrate that it has real-world consequences that are independent of general intelligence, education and professional expertise. He began by examining various philosophical definitions of wisdom, which he broke down into six specific principles of thinking. ‘I guess you would call them metacognitive components – various aspects of knowledge and cognitive processes that can guide you towards a more enriched complex understanding of a situation,’ he said.

As you would hope, these included some of the elements of reasoning that we have already examined, such as the ability to ‘consider the perspectives of the people involved in the conflict’, which reflects your capacity to seek out and absorb information that contradicts your initial view; and ‘recognising the ways in which the conflict might unfold’, which draws on the counter-factual thinking that Sternberg had studied in his measures of creative intelligence, as you try to imagine the different possible scenarios.

But his measure also involved some elements of reasoning that we haven’t yet explored, including an ability to ‘recognise the likelihood of change’, ‘search for a compromise’ and ‘predict conflict resolution’.

Last but not least, Grossmann considered intellectual humility – an awareness of the limits of our knowledge and of the inherent uncertainty in our judgement; essentially, the ability to see inside your own bias blind spot. It’s the philosophy that had guided Socrates more than two millennia ago, and which also lay at the heart of Franklin’s speech at the signing of the US Constitution.

Having identified these characteristics, Grossmann asked his participants to think out loud about various dilemmas – from newspaper articles concerning international conflicts to a syndicated ‘Dear Abby’ agony aunt column about a family disagreement – while a team of colleagues scored them on the various traits.

To get a flavour of the test, consider the following dilemma:

 

Dear Abby,

My husband, ‘Ralph’, has one sister, ‘Dawn’, and one brother, ‘Curt’. Their parents died six years ago, within months of each other. Ever since, Dawn has once a year mentioned buying a headstone for their parents. I’m all for it, but Dawn is determined to spend a bundle on it, and she expects her brothers to help foot the bill. She recently told me she had put $2,000 aside to pay for it. Recently Dawn called to announce that she had gone ahead, selected the design, written the epitaph and ordered the headstone. Now she expects Curt and Ralph to pay ‘their share’ back to her. She said she went ahead and ordered it on her own because she has been feeling guilty all these years that her parents didn’t have one. I feel that since Dawn did this all by herself, her brothers shouldn’t have to pay her anything. I know that if Curt and Ralph don’t pay her back, they’ll never hear the end of it, and neither will I.

 

The response of a participant scoring low on humility looked something like this:

 

I think the guys probably end up putting their share in . . . or she will never hear the end of it. I am sure they have hard feelings about it, but I am sure at the end they will break down and help pay for it.12

 

The following response, which acknowledges some crucial but missing information, earned a higher score for humility:

 

Dawn apparently is impatient to get this done, and the others have been dragging it out for 6 years or at least nothing’s been done for 6 years. It doesn’t say how much she finally decided would be the price . . . I don’t know that that’s how it happened, just that that seems the reasonable way for them to go about it. It really depends on the personalities of the people involved, which I don’t know.

 

Similarly, for perspective taking, a less sophisticated response would examine just one point of view:

 

I can imagine that it was a sour relationship afterward because let’s just say that Curt and Ralph decided not to go ahead and pay for the headstone. Then it is going to create a gap of communication between her sister and her brothers.

 

A wiser response instead begins to look more deeply into the potential range of motives:

 

Somebody might believe that we need to honour parents like this. Another person might think there isn’t anything that needs to be done. Or another person might not have the financial means to do anything. Or it could also mean that it might not be important to the brothers. It often happens that people have different perspectives on situations important to them.

 

The high scorer could also see more possibilities for the way the conflict might be resolved:

 

I would think there would probably be some compromise reached, that Curt and Ralph realize that it’s important to have some kind of headstone, and although Dawn went ahead and ordered it without them confirming that they’d pitch in, they would probably pitch in somehow, even if not what she wanted ideally. But hopefully, there was some kind of contribution.

 

As you can see, the responses are very conversational – they don’t demand advanced knowledge of philosophical principles, for example – but the wiser participants are simply more willing to think their way around the nuances of the problem.

After the researchers had rated the participants’ thinking, Grossmann compared these scores to different measures of wellbeing. The first results, published in 2013 in the Journal of Experimental Psychology, found that people with higher scores for wise reasoning fared better in almost every aspect of life: they were more content and less likely to suffer depression, and they were generally happier with their close relationships.

Strikingly, they were also slightly less likely to die during a five-year follow-up period, perhaps because their wiser reasoning meant they were better able to judge the health risks of different activities, or perhaps because they were better able to cope with stress. (Grossmann emphasises that further work is needed to replicate this particular finding, however.)

Crucially, the participants’ intelligence was largely unrelated to their wise reasoning scores, and had little bearing on any of these measures of health and happiness.13 The idea that ‘I am wise because I know that I know nothing’ may have become something of a cliché, but it is still rather remarkable that qualities such as your intellectual humility and capacity to understand other people’s points of view may predict your wellbeing better than your actual intelligence.

This discovery complements other recent research exploring intelligence, rational decision making, and life outcomes. You may recall, for instance, that Wändi Bruine de Bruin found very similar results, showing that her measure of ‘decision making competence’ was vastly more successful than IQ at predicting stresses like bankruptcy and divorce.14 ‘We find again and again that intelligence is a little bit related to wise reasoning – it explains perhaps 5% of the variance, probably less, and definitely not more than that,’ said Grossmann.

Strikingly, Grossmann’s findings also converge with Keith Stanovich’s research on rationality. One of Stanovich’s sub-tests, for instance, measured a trait called ‘actively open-minded thinking’, which overlaps with the concept of intellectual humility, and which also includes the ability to think about alternative perspectives. How strongly would you agree with the statement that ‘Beliefs should always be revised in response to new information or evidence’, for instance? Or ‘I like to gather many different types of evidence before I decide what to do’? He found that participants’ responses to these questions often proved to be a far better predictor of their overall rationality than their general intelligence – which is reassuring, considering that unbiased decision making should be a key component of wisdom.15

Grossmann agrees that a modest level of intelligence will be necessary for some of the complex thinking involved in these tasks. ‘Someone with severe learning difficulties won’t be able to apply these wisdom principles.’ But beyond a certain threshold, the other characteristics – such as intellectual humility and open-minded thinking – become more crucial for the decisions that truly matter in life.

Since Grossmann published those results, his theories have received widespread acclaim from other psychologists, including a Rising Star Award from the American Psychological Association.16 His later research has built on those earlier findings with similarly exciting results. With Henri Carlos Santos, for instance, he examined longitudinal data from previous health and wellbeing surveys that had, by good fortune, included questions on some of the qualities that are important to his definition of wisdom, including intellectual humility and open-mindedness. Sure enough, he found that people who scored more highly on these characteristics at the start of the survey tended to report greater happiness later on.17

He has also developed methods that allow him to test a greater number of people. One study asked participants to complete an online diary for nine days, with details about the problems they faced and questionnaires examining their thinking in each case. Although some people consistently scored higher than others, their behaviour was still highly dependent on the situation at hand. In other words, even the wisest person may act foolishly in the wrong circumstances.18

This kind of day-to-day variation can be seen in personality traits such as extraversion, Grossmann says, as each person’s behaviour varies around a fixed set point; a mild introvert may still prefer to be quietly alone at work, but then become more gregarious around the people she trusts. Similarly, it’s possible that someone may be fairly wise when dealing with a confrontational colleague – but then lose their head when dealing with their ex.

The question is, how can we learn to change that set point?

Benjamin Franklin’s writings offer anecdotal evidence that wisdom can be cultivated. According to his autobiography, he had been a ‘disputatious’ youth, but that changed when he read an account of Socrates’ trial.19 Impressed by the Greek philosopher’s humble method of enquiry, he determined to always question his own judgement and respect other people’s, and in his conversation, he refused to use words such as ‘certainly, undoubtedly, or any others that give the air of positiveness to an opinion’. Soon it became a permanent state of mind. ‘For these fifty years past no one has ever heard a dogmatical expression escape me,’ he wrote.

The result was the kind of humble and open mind that proves to be so critical in Grossmann’s research on evidence-based wisdom. ‘I find a frank acknowledgement of one’s ignorance is not only the easiest way to get rid of a difficulty, but the likeliest way to obtain information, and therefore I practice it,’ Franklin wrote in 1755, while discussing his confusion over a recent scientific result. ‘Those who affect to be thought to know everything, and so undertake to explain everything, often remain long ignorant of many things that others could and would instruct them in, if they appeared less conceited.’20

Unfortunately, the scientific research suggests that good intentions may not be sufficient. A classic psychological study by Charles Lord in the late 1970s found that simply telling people to be ‘as objective and unbiased as possible’ made little to no difference in counteracting the myside bias. When considering arguments for the death penalty, for instance, subjects still tended to come to conclusions that suited their preconceptions and still dismissed the evidence opposing their view, despite Lord’s instructions.21 Clearly, wanting to be fair and objective alone isn’t enough; you also need practical methods to correct your blinkered reasoning.

Luckily, Franklin had also developed some of those strategies – methods that psychologists would only come to recognise centuries later.

His approach is perhaps best illustrated through a letter to Joseph Priestley in 1772. The British clergyman and scientist had been offered the job of overseeing the education of the aristocrat Lord Shelburne’s children. This lucrative opportunity would offer much-needed financial security, but it would also mean sacrificing his ministry, a position he considered ‘the noblest of all professions’ – and so he wrote to Franklin for advice.

‘In the affair of so much importance to you, where-in you ask my advice, I cannot, for want of sufficient premises, counsel you what to determine, but if you please, I will tell you how,’ Franklin replied. He called his method a kind of ‘moral algebra’, and it involved dividing a piece of paper in two and writing the advantages and disadvantages on either side – much like a modern pros and cons list. He would then think carefully about each item and assign it a weight according to its importance; if a pro equalled a con, he would cross them both off the list. ‘Thus proceeding I find at length where the balance lies; and if, after a day or two of farther consideration, nothing new that is of importance occurs on either side, I come to a determination accordingly.’22

Franklin conceded that the values he placed on each reason were far from scientific, but argued that when ‘each is thus considered separately and comparatively, and the whole lies before me, I think I can judge better, and am less liable to make a rash step.’

As you can see, Franklin’s strategy is more deliberative and involved than the quick lists of advantages and disadvantages most of us may scribble in a notebook. Of particular importance is the careful way that he attempts to weigh up each item, and his diligence in suspending his judgement to allow his thoughts to settle. Franklin seems to have been especially aware of our tendency to lean heavily on the reasons that are most easily recalled. As he described in another letter, some people base their decisions on facts that just ‘happened to be present in the mind’, while the best reasons were ‘absent’.23 This tendency is indeed an important source of bias when we try to reason, which is why it’s so important to give yourself the time to wait until all the arguments are laid out in front of you.24
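Expressed as a procedure, the moral algebra is simply a weighted pros-and-cons ledger with cancellation. The sketch below (in Python) is one possible rendering of that bookkeeping, not Franklin’s own notation; the reasons and weights are invented for illustration, and the hard part – assigning honest weights and waiting a day or two for further considerations to surface – happens outside the code.

```python
# A minimal sketch of the bookkeeping behind Franklin's 'moral algebra'.
# The reasons and weights below are hypothetical; the real work lies in
# choosing them honestly and letting the list sit for a day or two.

def moral_algebra(pros, cons, tolerance=0.5):
    """Cancel pros and cons of roughly equal weight, then report the balance.

    pros, cons: dicts mapping a reason to its subjective weight.
    tolerance: how close two weights must be to 'cross them both off the list'.
    """
    pros, cons = dict(pros), dict(cons)
    for p, wp in list(pros.items()):
        for c, wc in list(cons.items()):
            if abs(wp - wc) <= tolerance:
                del pros[p], cons[c]   # a pro equals a con: strike out both
                break
    balance = sum(pros.values()) - sum(cons.values())
    return pros, cons, balance

# A Priestley-style dilemma, with invented weights
pros = {'financial security': 3.0, 'time for scientific work': 2.0}
cons = {'giving up the ministry': 3.0, 'dependence on a patron': 1.0}

remaining_pros, remaining_cons, balance = moral_algebra(pros, cons)
print(remaining_pros, remaining_cons, balance)
# {'time for scientific work': 2.0} {'dependence on a patron': 1.0} 1.0
```

If, after a day or two of further thought, nothing new tips the remaining balance, Franklin’s advice is to ‘come to a determination accordingly’ – here, a modest case for accepting the post.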

Whether or not you follow Franklin’s moral algebra to the letter, psychologists have found that deliberately taking time to ‘consider the opposite’ viewpoint can reduce a range of reasoning errors,25 such as anchoring,26 over-confidence,27 and, of course, the myside bias. The benefits appear to be robust across many different decisions – from helping people to critique dubious health claims28 to forming an opinion on capital punishment and reducing sexist prejudice.29 In each case, the aim was to actively argue against yourself, and consider why your initial judgement may be wrong.30*

 

* The thirteenth-century philosopher Thomas Aquinas, incidentally, used similar techniques in his own theological and philosophical inquiries. As the philosopher Jason Baehr (a modern champion of intellectual humility, whom we’ll meet in Chapter 8) points out, Aquinas deliberately argued against his own initial position on any question, doing ‘his best to make these objections as forceful or strong as possible’. He then argued against those objections with equal force, until eventually his view reached some kind of equilibrium.

 

Depending on the magnitude of the decision, you may benefit from undergoing a few iterations of this process, each time reaching for an additional piece of information that you overlooked on your first pass.31 You should also pay particular attention to the way you consider the evidence opposing your gut instinct, since you may still be tempted to dismiss it out of hand, even after you have acknowledged its existence. Instead, you might ask yourself: ‘Would I have made the same evaluation, had exactly the same evidence produced results on the other side of the issue?’

Suppose that, like Priestley, you are considering whether to take a new job and you have sought the advice of a friend, who encourages you to accept the offer. You might then ask: ‘Would I have given the same weight to my friend’s judgement had she opposed the decision?’32 It sounds convoluted, but Lord’s studies suggested this kind of approach really can overcome our tendency to dismiss the evidence that doesn’t fit our preferred point of view.

You might also try to imagine that someone else will examine your justifications, or even try to present them to a friend or colleague. Many studies have shown that we consider more points of view when we believe that we will need to explain our thinking to others.33

We can’t know if Franklin applied his moral algebra in all situations, but the general principle of deliberate open-minded thinking seems to have dictated many of his biggest decisions. ‘All the achievements in the public’s interest – getting a fire department organised, the streets paved, a library established, schools for the poor supported, and much more – attest to his skill in reading others and persuading them to do what he wanted them to do’, writes the historian Robert Middlekauff.34 ‘He calculated and measured; he weighed and he assessed. There was a kind of quantification embedded in the process of his thought . . . This indeed describes what was most rational about Franklin’s mind.’

This kind of thinking is not always respected, however. Particularly in times of crisis, we sometimes revere ‘strong’, single-minded leaders who stay true to their convictions, and even Franklin was once considered too ‘soft’ to negotiate with the British during the War of Independence. He was later appointed as one of the commissioners, however, and proved to be a shrewd opponent.

And there is some evidence that a more open-minded approach may lie behind the success of many other leaders. One analysis, for instance, examined the texts of UN General Assembly speeches concerning the Middle East conflict from 1947 to 1976, scoring the content for the speakers’ consideration and integration of alternative points of view – the kind of open-minded thinking that was so important for Grossmann’s measure of wisdom. The researchers found that this score consistently dropped in the periods preceding a war, whereas higher scores seemed to sustain longer intervals of peace.

It would be foolish to read too much into post-hoc analyses – after all, people would naturally become more closed-minded during times of heightened tension.35 But lab experiments have found that people scoring lower on these measures are more likely to resort to aggressive tactics. And the idea does find further support in an examination of the US’s most important political crises of the last 100 years, including John F. Kennedy’s handling of the Cuban missile crisis, and Richard Nixon’s dealings with the Cambodian invasion of 1970 and the Yom Kippur War of 1973.

Textual analyses of the speeches, letters and official statements made by presidents and their Secretaries of State show that the level of open-minded thinking consistently predicted the later outcome of the negotiations, with JFK scoring highly for his successful handling of the Cuban missile crisis, and Dwight Eisenhower for the way he dealt with the two Taiwan Strait conflicts between Mainland China and Taiwan in the 1950s.36

In more recent politics, the German Chancellor Angela Merkel is famous for her ‘analytical detachment’, listening to all perspectives before making a decision; one senior government official describes her as ‘the best analyst of any given situation that I could imagine’.

The Germans have even coined a new word – merkeln (to Merkel) – that captures this patient, deliberative stance, though it’s not always meant flatteringly, since it can also reflect frustrating indecision.37 ‘I am regarded as a permanent delayer sometimes,’ she has said herself, ‘but I think it is essential and extremely important to take people along and really listen to them in political talks.’ And it has served her well, helping her to remain one of the longest-serving European leaders despite some serious economic crises.

If we recall the idea that many intelligent people are like a car speeding along the road without guidance or caution, then Merkel, Eisenhower and Franklin represent patient, careful drivers: despite their formidable engines, they know when to hit the brakes and check the terrain before deciding on their route.38

Franklin’s moral algebra is just one of many potential ways to cultivate wisdom, and further insights come from a phenomenon known as Solomon’s Paradox, which Grossmann named after the legendary king of Israel in the tenth century BC.

According to biblical accounts, God appeared to Solomon in a dream and offered to give him a special gift at the start of his reign. Rather than choosing wealth, honour or longevity, he chose wisdom of judgement. His insight was soon put to the test when two harlots appeared before him, both claiming to be the mother of a boy. Solomon ordered the child to be cut in two – knowing that the true mother would rather renounce her claim than see her son killed. The decision is often considered the epitome of impartial judgement – and people soon travelled from across the land to receive his counsel. He led the kingdom to great riches and built Jerusalem’s Temple.

Yet Solomon is said to have struggled to apply his famously wise judgement in his personal life, which was ruled by intemperate passions. Despite being the chief Jewish priest, for instance, he defied the Torah’s commandments by taking a thousand wives and concubines, and he amassed huge personal wealth. He became a ruthless and greedy tyrant, and was so embroiled in his affairs that he neglected to educate his son and prepare him for power. The kingdom ultimately descended into chaos and war.39

Three millennia later, Grossmann has found this same ‘asymmetry’ in his own tests of wisdom. Like Solomon, many people reason wisely about other people’s dilemmas, but struggle to reason clearly about their own issues, as they become more arrogant in their opinions, and less able to compromise – another form of the bias blind spot.40 These kinds of errors seem to be a particular problem when we feel threatened, triggering so-called ‘hot’ emotional processing that is narrow and closed-minded.

The good news is that we can use Solomon’s Paradox to our advantage by practising a process called ‘self-distancing’. To get a flavour of its power, think of a recent event that made you feel angry. Now ‘take a few steps back’, almost as if you were watching yourself from another part of the room or on a cinema screen, and describe the unfolding situation to yourself. How did you feel?

In a series of experiments, Ethan Kross at the University of Michigan has shown that this simple process encourages people to take a more reflective attitude towards their problems – using ‘cool’ rather than ‘hot’ processing. He found, for instance, that they were more likely to describe the situation with more neutral words, and they began to look for the underlying reasons for their discontent, rather than focusing on the petty details.41

Consider these two examples. The first is from an ‘immersed’, first-person perspective.

 

‘I was appalled that my boyfriend told me he couldn’t connect with me because he thought I was going to hell. I cried and sat on the floor of my dorm hallway and tried to prove to him that my religion was the same as his . . .’

 

And the second is from the distanced viewpoint:

 

‘I was able to see the argument more clearly . . . I initially empathized better with myself but then I began to understand how my friend felt. It may have been irrational but I understand his motivation.’

You can see how the event became less personal, and more abstract, for the second participant, who began to look beyond their own experience to understand the conflict.

Kross emphasises that this is not just another form of avoidance, or suppression. ‘Our conception was not to remove them from the event but to give them a small amount of distance, hold them back a little bit, and then allow them to confront the emotion from a healthier stance,’ he told me in an interview. ‘When you do this from an immersed perspective, people tend to focus on what happened to them. Distancing allows them to shift into this meaning-making mode where they put the event into a broader perspective and context.’

He has since repeated the finding many times, using different forms of self-distancing. You may imagine yourself as a fly on the wall, for instance, or a well-intentioned observer. Or you may try to imagine your older, wiser self looking back at the event from the distant future. Simply talking about your experiences in the third person (‘David was talking to Natasha, when . . .’) can also bring about the necessary change of perspective.

Kross points out that many people naturally self-distance to process unpalatable emotions. He points to an interview in which the basketball player LeBron James described his choice to leave the Cleveland Cavaliers (who had nurtured his career) and move to the Miami Heat. ‘One thing I didn’t want to do was make an emotional decision. I wanted to do what’s best for LeBron James and to do what makes LeBron James happy.’ Malala Yousafzai, meanwhile, used a similar approach to bolster her courage against the Taliban. ‘I used to think that the Tali[ban] would come and he would just kill me. But then I said [to myself], if he comes, what would you do Malala? Then I would reply to myself, Malala just take a shoe and hit him.’

People who spontaneously take a new perspective in this way enjoy a range of benefits, including reduced anxiety and rumination.42 Adopting that distanced perspective even helped one group of study participants to confront one of the most feared events in modern life: public speaking. Using self-distancing as they psyched themselves up for a speech, they showed fewer physiological signs of threat, and reported less anxiety, than a control group taking the immersed, first-person perspective. The benefits were visible to observers, too, who judged their talks to be more confident and powerful.43

In each case, self-distancing had helped the participants to avoid that self-centred ‘hot’ cognition that fuels our bias, so that their thinking was no longer serving their anger, fear, or threatened ego. Sure enough, Grossmann has found that self-distancing resolved Solomon’s Paradox when thinking about personal crises (such as an unfaithful partner), meaning that people were more humble and open to compromise, and more willing to consider the conflicting viewpoints.44 ‘If you become an observer, then right away you get into this inquisitive mode and you try to make sense of the situation,’ Grossmann told me. ‘It almost always co-occurs with being intellectually humble, considering different perspectives and integrating them together.’

And that may have a serious impact on your relationships. A team led by Eli Finkel at Northwestern University tracked 120 married couples over a period of two years. The initial arc of their relationships was not promising: over the first twelve months, most of the couples faced a downward spiral in their relationship satisfaction, as disappointment and resentments started to build. After a year, however, Finkel gave half of the couples a short course on self-distancing – such as imagining a dispute through the eyes of a more dispassionate observer.

Compared to typical relationship counselling, it was a tiny step – the lesson in self-distancing lasted about twenty minutes in total. But it transformed the couples’ love stories, resulting in greater intimacy and trust over the following year, as they constructively worked through their differences. The control group, in contrast, continued their steady decline for the next year, as resentment continued to build.45

These are highly intimate problems, but taking a distant viewpoint also seems to remedy bias on less personal subjects. When told to imagine how citizens in other countries would view forthcoming elections, for instance, Grossmann’s participants became more open-minded to conflicting views. After the experiment, he found that they were also more likely to take up an offer to sign up for a bipartisan discussion group – offering further, objective evidence that they were now more open to dialogue as a result of the intervention.46

As the research evolves, Grossmann has now started to examine the conditions of the effect more carefully, so that he can find even more effective self-distancing techniques to improve people’s reasoning. One particularly potent method involves imagining that you are explaining the issue to a twelve-year-old child. Grossmann speculates that this may prime you to be more protective, so that you avoid any bias that could sway their young and naïve mind.47

His team call this phenomenon the ‘Socrates Effect’ – the humble Greek philosopher correcting the egocentric passions of the mighty Israelite king.

If you still doubt that these principles will help you make better decisions, consider the achievements of Michael Story, a ‘super-forecaster’ whose talents first came to light through the Good Judgment Project – a US government-funded initiative to improve its intelligence programme.

The Good Judgment Project was the brainchild of Philip Tetlock, a political scientist who had already caused shockwaves among intelligence analysts. Whenever we turn on the TV news or read a newspaper, we meet commentators who claim to know who will win an election or whether a terrorist attack is imminent; behind closed doors, intelligence analysts may advise governments to go to war, direct NGOs’ rescue efforts or counsel banks on the next big merger. But Tetlock had previously shown that these professionals often performed no better than if they had been making random guesses – and many performed consistently worse.

Later research has confirmed that rapid, intuitive decision making leaves many intelligence analysts more susceptible to biases such as framing, with some scoring worse than students on tests of rationality.48

It was only after the US-led invasion of Iraq in 2003 – and the disastrous hunt for Saddam Hussein’s ‘Weapons of Mass Destruction’ – that the US intelligence services finally decided to take action. The result was the founding of a new agency – the Intelligence Advanced Research Projects Activity – which eventually agreed to fund a four-year forecasting tournament, beginning in 2011, allowing researchers to arrange the participants into various groups and test their strategies.

Example questions included: ‘Will North Korea detonate a nuclear device before the end of the year?’ ‘Who will come top of the 2012 Olympics medals table?’ And, ‘How many additional countries will report cases of the Ebola virus in the next eight months?’ In addition to giving precise predictions on these kinds of events, the forecasters also had to declare their confidence in their judgements – and they would be judged extra harshly if that confidence turned out to be misplaced, whether too high or too low.
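Tournaments like this one typically reward calibration with a ‘proper scoring rule’ such as the Brier score – the squared gap between a stated probability and what actually happened. The sketch below is a minimal illustration of that principle with invented forecasts; it is not presented as the tournament’s exact scoring formula.

```python
# A minimal illustration of calibration scoring with a simple Brier score.
# The forecasts and outcomes below are invented.

def brier_score(forecast_prob, outcome):
    """Squared error between a probability forecast and the outcome (1 or 0).
    0.0 is perfect; a permanent 50 per cent hedge earns 0.25; 1.0 is maximally wrong."""
    return (forecast_prob - outcome) ** 2

# (stated probability that the event happens, what actually happened)
forecasts = [(0.22, 0), (0.80, 1), (0.95, 0)]

scores = [brier_score(p, o) for p, o in forecasts]
print([round(s, 3) for s in scores])        # roughly [0.048, 0.04, 0.903]
print(round(sum(scores) / len(scores), 3))  # average penalty, roughly 0.33
```

Notice how the third, confidently wrong forecast dominates the average – which is why hedging precisely, rather than boldly, pays off under this kind of rule.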

Tetlock’s team was called the Good Judgment Project, and after the first year he siphoned off the top 2 per cent, whom he called the ‘super-forecasters’, to see if they might perform better in teams than by themselves.

Michael joined the tournament midway through the second year, and he quickly rose to become one of the most successful participants. Having worked in various jobs, including documentary film-making, he had returned to academia for a Master’s degree when he saw an advert for the tournament on an economics blog. The idea of being able to test and quantify his predictions instantly appealed.

Michael can still remember meeting other ‘supers’ for the first time. ‘There are loads of weird little things about us that are very similar,’ he told me; they share an inquisitive, hungry mind with a thirst for detail and precision, and this was reflected in their life decisions. One of his friends compared it to the ending of ET, ‘where he goes back to his home planet, and he meets all the other ETs’.

Their observations tally with Tetlock’s more formal investigations. Although the super-forecasters were all smart on measures of general intelligence, ‘they did not score off-the-charts high and most fall well short of so-called genius territory’, Tetlock noted. Instead, he found that their success depended on many other psychological traits – including the kind of open-minded thinking, and the acceptance of uncertainty, that was so important in Grossmann’s research. ‘It’s being willing to acknowledge that you have changed your mind many times before – and you’ll be willing to change your mind many times again,’ Michael told me. The super-forecasters were also highly precise with their declarations of confidence – specifying 22 per cent certainty, as opposed to 20 per cent, say – which perhaps reflects an overall focus on detail and precision.

Tetlock had already seen signs of this in his earlier experiments, finding that the worst pundits tended to express themselves with the most confidence, while the best performers allowed more doubt to creep into their language, ‘sprinkling their speech with transition markers such as “however”, “but”, “although” and “on the other hand” ’.

Remember Benjamin Franklin’s determination to avoid ‘certainly, undoubtedly, or any other [phrases] that give the air of positiveness to an opinion’? More than two hundred years later, the super-forecasters were again proving exactly the same point: it pays to admit the limits of your knowledge.

In line with Grossmann’s research, the super-forecasters also tended to look for outside perspectives; rather than getting stuck in the fine details of the specific situation at hand, they would read widely and look for parallels with other, seemingly unconnected, events. Someone investigating the Arab Spring, for instance, might look beyond Middle Eastern politics to see how similar revolutions had played out in South America.

Interestingly, many of the super-forecasters – including Michael – had lived and worked abroad at some point in their life. Although this may have just been a coincidence, there is some good evidence that a deep engagement with other cultures can promote open-minded thinking, perhaps because it demands that you temporarily put aside your preconceptions and adopt new ways of thinking.49

The most exciting result, however, was the fact that these skills improved with training. With regular feedback, many people saw their accuracy slowly climb over the course of the tournament. The participants also responded to specific lessons. An hour-long online course on recognising cognitive biases, for instance, improved the forecasters’ estimates by around 10 per cent over the following year.

Often, the simplest way to avoid bias was to start out with a ‘base rate’ – examining, for instance, the average length of time it takes for any dictator to fall from power – before adjusting the estimate in light of the specifics of the case. Another simple strategy was to examine the worst- and best-case scenarios for each situation, offering some boundaries for your estimates.
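One way to make ‘anchor on the base rate, then adjust’ concrete is Bayes’ rule in odds form – not a formula the forecasters were required to use, but a useful check that weak case-specific evidence should only nudge you away from the historical average. The numbers below are invented.

```python
# A minimal sketch of 'start from the base rate, then adjust', using Bayes'
# rule in odds form. All numbers are invented for illustration.

def update(prior_prob, likelihood_ratio):
    """Adjust a probability by one piece of evidence.

    likelihood_ratio: how much more likely we'd see this evidence if the
    event is going to happen than if it isn't (>1 nudges up, <1 nudges down).
    """
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Hypothetical base rate: say a fifth of embattled leaders fall within a year
estimate = 0.20
for lr in (2.0, 1.5):        # two modest pieces of case-specific evidence
    estimate = update(estimate, lr)

print(round(estimate, 2))    # roughly 0.43: nudged upwards, still anchored
```

Without the base rate, the same two pieces of evidence applied to a gut feeling of 50 per cent would land you at 75 per cent – a reminder of how easily the specifics of a case can crowd out the statistics.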

Overall, the super-forecasters provided the perfect independent demonstration that wise decision making relies on many alternative thinking styles, besides those that are measured on standard measures of cognitive ability. As Tetlock puts it in his book Superforecasting: ‘A brilliant puzzle-solver may have the raw material for forecasting, but if he doesn’t also have an appetite for questioning basic, emotionally charged beliefs, he will often be at a disadvantage relative to a less intelligent person who has a greater capacity for self-critical thinking.’50

Grossmann says that he has only just come to appreciate these parallels. ‘I think there is quite a bit of convergence in those ideas,’ he told me.

Michael now works for a commercial spin-off, Good Judgment Inc., which offers courses in these principles, and he confirms that performance can improve with practice and feedback. However you perform, it’s important not to fear failure. ‘You learn by getting it wrong,’ Michael told me.

Before I finished my conversation with Grossmann, we discussed one final, fascinating experiment that took his wise reasoning tests to Japan.

As in Grossmann’s previous studies, the participants answered questions about news articles and agony aunt columns, and were then scored on the various aspects of wise reasoning, such as intellectual humility, the ability to take on board another viewpoint, and their ability to suggest a compromise.

The participants ranged from twenty-five to seventy-five years old, and in the USA, wisdom grew steadily with age. That’s reassuring: the more we see of life, the more open-minded we become. And it’s in line with some of the other measures of reasoning, such as Bruine de Bruin’s ‘adult decision-making competence scale’, in which older people also tend to score better.

But Grossmann was surprised to find that the scores from Tokyo followed a completely different pattern. There was no steep increase with age, because the younger Japanese were already as wise as the oldest Americans. Somehow, by the age of twenty-five, they had already absorbed the life lessons that only come to the Americans after decades more experience.51

Reinforcing Grossmann’s finding, Emmanuel Manalo, Takashi Kusumi and colleagues recently surveyed students in Okinawa and Kyoto in Japan, and in Auckland, New Zealand, on the kinds of thinking that they thought were most important at university. Although all three groups recognised the value of having an open-minded outlook, it’s striking that the Japanese students referred to some specific strategies that sound very much like self-distancing. One student from Kyoto emphasised the value of ‘thinking from a third person’s point of view’, for instance, while a participant in Okinawa said it was important to ‘think flexibly based on the opposite opinion’.52

What could explain these cultural differences? We can only speculate, but many studies have suggested that a more holistic and interdependent view of the world may be embedded in Japanese culture; Japanese people are more likely to focus on the context and to consider the broader reasons for someone’s actions, and less likely to focus on the ‘self’.53

Grossmann points to ethnographic evidence showing that children in Japan are taught to consider others’ perspectives and acknowledge their own weaknesses from a young age. ‘You just open an elementary school textbook and you see stories about these characters who are intellectually humble, who think of the meaning of life in interdependent terms.’

Other scholars have argued that this outlook may also be encoded in the Japanese language itself. The anthropologist Robert J. Smith noted that Japanese demands that you encode people’s relative status in every sentence, while lacking ‘anything remotely resembling the personal pronoun’. Although there are many possible ways to refer to yourself, ‘none of the options is clearly dominant’, particularly among children. ‘With overwhelming frequency, they use no self-referent of any kind.’

Even the pronunciation of your own name changes depending on the people with whom you are speaking. The result, Smith said, is that self-reference in Japan is ‘constantly shifting’ and ‘relational’ so that ‘there is no fixed centre from which the individual asserts a non-contingent existence’.54 Being forced to express your actions in this way may naturally promote a tendency for self-distancing.

Grossmann has not yet applied his wise reasoning tests to other countries, but converging evidence suggests that these differences are part of broader geographical trends.

Thanks, in part, to the practical difficulties inherent in conducting global studies, psychologists once focused almost entirely on Western populations, with the vast majority of findings emerging from US university students – highly intelligent, often middle-class people. But during the last ten years, they have begun to make a greater effort to compare the thinking, memory and perception of people across cultures. And they are finding that ‘Western, Educated, Industrialised, Rich, Democratic’ (WEIRD, for short) regions like North America and Europe score higher on various measures of individualism and the egocentric thinking that appears to lie behind our biases.

In one of the simplest ‘implicit’ tests, researchers ask participants to draw a diagram of their social network, representing their family and friends and their relationships to each other. (You could try it for yourself, before you read on.)

In WEIRD countries like the USA, people tend to represent themselves as bigger than their friends (by about 6 mm on average) while people from China or Japan tend to draw themselves as slightly smaller than the people around them.55 This is also reflected in the words they use to describe themselves: Westerners are more likely to describe their own personality traits and achievements, while East Asian people describe their position in the community. This less individualistic, more ‘holistic’ way of viewing the world around us can also be seen in India, the Middle East and South America,56 and there is some emerging evidence that people in more interdependent cultures find it easier to adopt different perspectives and absorb other people’s points of view – crucial elements of wisdom that would improve people’s thinking.57

Consider measures of over-confidence, too. As we have seen, most WEIRD participants consistently over-estimate their abilities: 94 per cent of American professors rate themselves as ‘better than average’, for instance, and 99 per cent of car drivers think they are more competent than the average.58 Yet countless studies have struggled to find the same tendency in China, Korea, Singapore, Taiwan, Mexico or Chile.59 Of course, that’s not to say that everyone in these countries will always be humble, wise thinkers; it almost certainly depends on the context, as people naturally flip between different ways of thinking. And the general characteristics may be changing over time. According to one of Grossmann’s recent surveys, individualism is rising across the globe, even in populations that traditionally showed a more interdependent outlook.60

Nevertheless, we should be ready to adopt the more realistic view of our own abilities that is common in East Asian and other cultures, as it could directly translate to a smaller ‘bias blind spot’, and better overall reasoning.

We have now seen how certain dispositions – particularly intellectual humility and actively open-minded thinking – can help us to navigate our way around the intelligence trap. And with Franklin’s moral algebra and self-distancing, we have two solid techniques that can immediately improve our decision making. They aren’t a substitute for greater intelligence or education, but they help us to apply that brainpower in a less biased fashion, so that we can use it more fruitfully while avoiding any intellectual landmines.

The science of evidence-based wisdom is still in its infancy, but over the next few chapters we will explore convergent research showing how cutting-edge theories of emotion and self-reflection can reveal further practical strategies to improve our decision making in high-stakes environments. We’ll also examine the ways that an open-minded, humble attitude, combined with sophisticated critical thinking skills, can protect us from forming dangerous false beliefs and from ‘fake news’.

Benjamin Franklin continued to embody intellectual humility to the very end. The signing of the Constitution in 1787 was his final great act, and he remained content with his country’s progress. ‘We have had a most plentiful year for the fruits of the earth, and our people seem to be recovering fast from the extravagant and idle habits which the war had introduced, and to engage seriously in the contrary habits of temperance, frugality, and industry, which give the most pleasing prospects of future national felicity’, he wrote to an acquaintance in London in 1789.61

In March 1790, the theologian Ezra Stiles asked Franklin about his belief in God and his chances of an afterlife. He replied: ‘I have, with most of the Dissenters in England, some doubts as to [Jesus’s] divinity, though it is a question I do not dogmatise upon, having never studied it, and think it needless to busy myself with it now, when I expect soon an opportunity of knowing the truth with less trouble.

‘I shall only add, respecting myself, that, having experienced the goodness of that Being in conducting me prosperously through a long life, I have no doubt of its continuance in the next, though without the smallest conceit of meriting such goodness.’62 He died little more than a month later.