RULE TEN

Keep an open mind

A man with a conviction is a hard man to change.
Tell him you disagree and he turns away. Show him
facts or figures and he questions your sources.
Appeal to logic and he fails to see your point.

LEON FESTINGER, HENRY RIECKEN and STANLEY SCHACHTER, When Prophecy Fails1

Irving Fisher was one of the greatest economists who ever lived.2

‘Anywhere from a decade to two generations ahead of his time,’ opined the first Nobel laureate economist Ragnar Frisch, in the late 1940s, more than half a century after Fisher’s genius first lit up his subject. Paul Samuelson, who won the Nobel Memorial Prize the year after Frisch, said that Irving Fisher’s 1891 PhD thesis ‘was the greatest doctoral dissertation in economics ever written’.

That’s what Fisher’s peers thought. The public loved him too. A hundred years ago, Irving Fisher was the most famous economist on the planet. Yet Fisher is remembered now only by economists with a sense of history. He’s no longer a household name like Milton Friedman, Adam Smith or John Maynard Keynes, his younger contemporary. That is because something awful happened to Irving Fisher, and to his reputation – something with a lesson for us all.

Fisher’s downfall certainly wasn’t through lack of ambition. ‘How much there is I want to do!’ he wrote to an old school friend while studying at Yale. ‘I always feel that I haven’t time to accomplish what I wish. I want to read much. I want to write a great deal. I want to make money.’

It was understandable that money was important to Fisher. His father had died of tuberculosis the very week that Irving arrived at Yale. Fisher’s drive and intellect kept him afloat: he won prizes in Greek and Latin, for algebra and mathematics, for public speaking (finishing second to a future US Secretary of State), and was both the class valedictorian and a member of the rowing crew. Yet amid all these achievements, the young man needed to scramble for funds throughout his studies; he understood what it was to struggle financially while surrounded by wealth.

At the age of twenty-six, however, Fisher found himself with a small fortune at his disposal. He married a childhood playmate, Margaret Hazard, who was the daughter of a wealthy industrialist. Irving and Margaret’s wedding in 1893 was sumptuous enough to be covered by the New York Times, with two thousand invited guests, three ministers, an extravagant lunch and a 60lb wedding cake. They commenced a fourteen-month European honeymoon and returned to a brand-new mansion at 460 Prospect Street, New Haven. It had been built in their absence as a wedding present from Margaret’s father and was furnished with a library, a music room and spacious offices.

There are three things you should know about Irving Fisher.

The first is that he was a health fanatic. This was understandable. Tuberculosis had killed the young man’s father; fifteen years later, the disease nearly killed him, too. No wonder he adopted a fastidious health regime: he abstained from alcohol, tobacco, meat, tea, coffee and chocolate. One dinner guest enjoyed his hospitality while noting his quirkiness: ‘While I ate right through my succession of delicious courses, he dined on a vegetable and a raw egg.’3

This wasn’t just a personal matter: he was an evangelist for health and nutrition. He founded the ‘Life Extension Institute’ and persuaded William Taft, who’d just stepped down as President, to be its chairman. (It may seem an ironic choice: Taft was obese, the heaviest man ever to be President. Taft’s weight problem did, however, prompt his interest in diet and exercise.) In 1915, when Fisher was nearly fifty years old, he published a book titled How to Live: Rules for Healthful Living Based on Modern Science. (How to live! Now that’s real ambition.) It was a huge bestseller, and it’s hilarious from a modern perspective. ‘[I advocate a] sun-bath . . . common sense must dictate its intensity and duration’ . . . ‘it is important [to] practice thorough mastication . . . chewing to the point of natural, involuntary swallowing’. He even adds a discussion of the correct angle between the feet while walking – ‘about seven or eight degrees of out-toeing in each foot’.4

And there’s a short section on eugenics. It hasn’t aged well.

But while it’s easy to laugh at the book, How to Live is in many ways as far ahead of its time as Fisher’s economic analysis. Fisher applied scientific thinking to the question of well-being. He described detailed exercises, preached mindfulness, and at a time when the majority of doctors were smokers, correctly warned that tobacco causes cancer.

That is the second thing you need to know about Irving Fisher: he believed in the power of rational, numerical analysis, in economics and elsewhere. He calculated the net economic cost of tuberculosis. He conducted experimental investigations of vegetarianism and even of thorough mastication, which he found to increase endurance. (A 1917 advertisement for the breakfast cereal Grape Nuts included an endorsement from Professor Fisher.) At one point in How to Live, he even pauses to inform the reader that ‘in the modern study of scientific clothing there is a new unit, the “clo”. This is a technical unit for measuring the “warming power” of clothing.’

It is arguable that his love of numbers occasionally led him astray. For example, when Fisher quantified the benefits of prohibition, he exuberantly generalised from a small study suggesting that a stiff drink on an empty stomach made workers 2 per cent less efficient. Fisher calculated that prohibition would add $6 billion to America’s economy – which at the time was an absolutely enormous gain. We saw in the first chapter that Abraham Bredius’s art expertise had given him reasons to believe that Han van Meegeren’s rotten forgery was truly a Vermeer. Similarly, Fisher’s statistical expertise allowed him to produce grand calculations about prohibition on shaky foundations. His strong feelings about the evils of alcohol were undermining the rigour of his statistical reasoning.5

There’s also the money – that’s the third thing you need to know. Irving Fisher was rich, and not just because of his wife’s inheritance. Making money was a matter of pride for Fisher; he didn’t want to be dependent on his wife. There were the book royalties from How to Live. There were his inventions, most notably a way of organising business cards that was the forerunner of the Rolodex. He sold that invention to a stationery company in return for $660,000 in cash – many millions of dollars in today’s terms – plus a seat on its board and a bundle of stocks.

Fisher turned his academic research into a major business operation called the ‘Index Number Institute’. It sold data, forecasts and analysis as a syndicated package, ‘Irving Fisher’s Business Page’, to newspapers across the United States. Forecasting was a natural extension of the data and analysis. After all, if we want to make the world add up, it’s not always because the intellectual joy of understanding is an end in itself. Sometimes we’re interested in sizing up the current situation as a means to anticipate, and perhaps profit from, what will happen next.

With such a platform, Fisher was able to evangelise about his approach to investment – which, broadly speaking, was to bet on American growth by buying shares in the new industrial corporations using borrowed money. Such borrowing is often called leverage, since it magnifies both profits and losses.
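
To see how leverage cuts both ways, here is a minimal sketch with invented figures (not Fisher’s actual positions). The point is simply that the loan must be repaid in full whatever the shares do, so every move in the share price lands twice as hard on the investor’s own stake.

```python
# Illustrative only: a $10,000 stake topped up with $10,000 of borrowed money,
# all invested in shares. The loan is owed in full regardless of what the
# shares do, so each move in the share price is doubled in the investor's
# own equity.

def equity_after(own_money, borrowed, price_change):
    """Investor's equity after shares move by price_change (e.g. +0.20 = +20%)."""
    portfolio = (own_money + borrowed) * (1 + price_change)
    return portfolio - borrowed  # the loan must still be repaid in full

for change in (+0.20, -0.20, -0.60):
    end = equity_after(10_000, 10_000, change)
    print(f"shares {change:+.0%}: equity ${end:,.0f} ({(end - 10_000) / 10_000:+.0%})")
```

A 20 per cent rise becomes a 40 per cent gain; a 20 per cent fall becomes a 40 per cent loss; and a deep enough fall leaves the investor owing more than they own.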

But during the 1920s, stock market investors had few losses to worry about. Share prices were soaring. Anyone who had made leveraged bets on that growth had every reason to feel clever. Fisher wrote to his old school friend to inform him that his ambition had been fulfilled. ‘We are all making a lot of money!’

In the summer of 1929, Irving Fisher – bestselling author, inventor, friend of presidents, entrepreneur, health campaigner, syndicated columnist, statistical pioneer, the greatest academic economist of his generation, and a millionaire many times over – was able to boast to his son that a renovation of the family mansion had been paid for not by Hazard family money but by Irving Fisher himself.

That achievement mattered to him. Fisher’s own father hadn’t lived to see his seventeen-year-old boy grow into one of the most respected figures of the age; as Irving and his son watched the mansion being reshaped before them, he could, perhaps, be forgiven his pride. But he was standing on the brink of a financial precipice.

*

The stock market cracked in the autumn of 1929. The Dow Jones Industrial Average fell by more than a third between the beginning of September and the end of November. But it wasn’t the great Wall Street crash that did for Irving Fisher – at least, not immediately. The crash, of course, was a cataclysmic financial event, one far more severe even than the banking crisis of 2008. The Great Depression that followed was the greatest peacetime economic calamity to befall the western world. Fisher was more exposed than many, since he had made his investments with leverage, magnifying both losses and gains.

But it took more than a leveraged bet on a financial bubble to ruin Fisher. It took stubbornness. The crash had its dramatic moments, but it was not simply a matter of lurches on days such as ‘Black Thursday’ or ‘Black Monday’. It was best understood as a long downward grind, punctuated by brief rallies, all the way from 380 points in September 1929 to just over 40 points by the summer of 1932. If Fisher had cut his losses and stepped back from the market in late 1929, he would have been fine. He could have returned to his academic research, his many other enthusiasms and a luxurious lifestyle funded by many years of trading profits, along with his income as an author and businessman.

Instead, Fisher doubled down on his initial views. He was convinced the market would turn upward again. He made several comments about how the crash was the ‘shaking out of the lunatic fringe’ and reflected ‘the psychology of panic’. He publicly declared that recovery was imminent. It was not.

Most important, he didn’t just stay invested in the market. His confidence that he was right made him continue to rely on borrowed money in the hope of bigger gains. One of Fisher’s major investments was in Remington Rand, following the sale of his card-index system, ‘Index Visible’. The share price tells the story: $58 before the crash, $28 within a few months. Fisher might have learned by then that leverage was terribly risky. But no: he borrowed more money to invest – and the share price soon dropped to $1. That is a sure route to ruin.

We shouldn’t be too quick to judge Fisher. Even if you’re the smartest one in the room – and Irving Fisher usually was – it simply isn’t easy to change your mind.

Irving Fisher’s contemporary, Robert Millikan, was no less distinguished a man than Fisher. His interests were a little different, however: Millikan was a physicist. In 1923, as Fisher’s stock tips were being devoured across the United States, Millikan was collecting a Nobel Prize.

For all his achievements, Millikan is most famous for an experiment so simple that a school kid can attempt it: the ‘oil drop’ experiment, in which a mist of oil droplets from a perfume spritzer is given an electrical charge while floating between two electrified plates. Millikan could adjust the voltage between the plates until oil drops were suspended, without moving – and since he could measure the diameter of the drops, he could calculate their mass, and thus also the electrical charge that was precisely offsetting the pull of gravity. This, in effect, allowed Millikan to calculate the electrical charge of a single electron.
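
Here is a back-of-envelope version of that calculation, with invented measurements rather than Millikan’s own, and ignoring refinements such as buoyancy and the falling-drop timings he used to infer each droplet’s radius. A suspended droplet feels an upward electric force that exactly balances its weight, which is enough to recover the charge it carries.

```python
# A sketch of the balanced-drop arithmetic with hypothetical numbers.
# Balance condition: q * (V / d) = m * g, so q = m * g * d / V,
# where the droplet's mass comes from its radius and the oil's density:
# m = (4/3) * pi * r**3 * rho.
import math

r = 1.0e-6    # droplet radius in metres (hypothetical measurement)
rho = 900.0   # density of the oil in kg per cubic metre (typical value)
g = 9.81      # gravitational acceleration in m/s^2
d = 0.01      # separation of the plates in metres (hypothetical)
V = 463.0     # balancing voltage in volts (hypothetical, chosen for a tidy answer)

m = (4.0 / 3.0) * math.pi * r**3 * rho  # mass of the droplet
q = m * g * d / V                       # charge that exactly offsets gravity

print(f"droplet mass:    {m:.3e} kg")
print(f"inferred charge: {q:.3e} C")
print(f"in units of e:   {q / 1.602e-19:.1f}")
```

With these made-up numbers the inferred charge comes out at roughly five times the accepted value of e. That is the point of the method: repeat it for many droplets and the charges cluster around whole-number multiples of a single fundamental unit.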

I was one of countless students who attempted this experiment in school, but in all honesty I was unable to get my results quite as neat as Millikan’s. There are a lot of details to get right – in particular, the experiment depends on correctly measuring the diameter of the tiny oil droplet. Mis-measure that, and all your other calculations will be off.

We now know that even Millikan didn’t get his answers quite as neat as he claimed he did. He systematically omitted observations that didn’t suit him, and lied about those omissions. (He also minimised the contribution of a junior colleague, Harvey Fletcher.) Historians of science argue about the seriousness of this cherry-picking, ethically and practically. What seems clear is that if the scientific world had seen all of Millikan’s results, it would have had less confidence that his answer was right. That would have been no bad thing, because it wasn’t. Millikan’s answer was too low.6

The charismatic Nobel laureate Richard Feynman pointed out in the 1970s that the process of fixing Millikan’s error with better measurements was a strange one: ‘One is a little bit bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher. Why didn’t they discover the new number was higher right away?’7

The answer is that whenever a number was close to Millikan’s, it was accepted without too much scrutiny. When a number seemed wrong it would be viewed with scepticism. Reasons would be found to discard it. As we saw in the first chapter, our preconceptions are powerful things. We filter new information. If it accords with what we expect, we’ll be more likely to accept it.

And since Millikan’s estimate was too low, it would be rare to have a measurement that was so much lower as to be unexpected. Typically, the surprising measurements would be substantially larger than Millikan’s instead. Accepting them was a long and gradual process. It wasn’t helped by the fact that Millikan had discarded some of his measurements to make himself seem like a more accomplished scientist. But we can be confident that it would have happened anyway, because a later study found the same pattern of gradual convergence in other estimates of physical constants such as Avogadro’s number and Planck’s constant.* Convergence continued throughout the 1950s and 1960s and sometimes into the 1970s.8 It’s a powerful demonstration of the way that even scientists measuring essential and unchanging facts filter the data to suit their preconceptions.
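
A toy simulation, with entirely invented numbers, shows how this kind of filtering produces exactly the slow creep Feynman described. If results close to the current consensus sail through while surprising ones are scrutinised and often discarded, the accepted value inches towards the truth instead of jumping there.

```python
# Hypothetical illustration of preconception-driven filtering, not a model of
# the real physics. Each "year", labs measure the true value plus honest noise,
# but results far from the current consensus are more likely to be discarded.
import random

random.seed(1)
true_value = 1.00  # the "real" constant, in arbitrary units
consensus = 0.90   # an initial published estimate that is too low
noise = 0.05       # spread of honest measurement error

for year in range(1, 11):
    kept = []
    while len(kept) < 20:  # each year, twenty results make it into print
        measurement = random.gauss(true_value, noise)
        surprise = abs(measurement - consensus)
        if random.random() < max(0.05, 1 - 8 * surprise):
            kept.append(measurement)  # unsurprising results face little scrutiny
    consensus = sum(kept) / len(kept)  # the consensus updates only on kept data
    print(f"year {year:2d}: consensus = {consensus:.3f}")
```

Nothing about the underlying measurements changes from year to year; only the willingness to believe them does, and that alone is enough to reproduce a drawn-out drift towards the true value.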

This shouldn’t be entirely surprising. Our brains are always trying to make sense of the world around us, based on incomplete information. The brain makes predictions about what it expects, and tends to fill in the gaps, often based on surprisingly sparse data. That is why we can understand a routine telephone conversation on a bad line – until the point at which genuinely novel information such as a phone number or street address is being spoken through the static. Our brains fill in the gaps – which is why we see what we expect to see, and hear what we expect to hear, just as Millikan’s successors found what they expected to find. It is only when we can’t fill in the gaps that we realise just how bad the connection is.

We even smell what we expect to smell. When scientists give people a whiff of scent, the reactions differ sharply based on whether the scientists have told the experimental subjects ‘this is the aroma of a gourmet cheese’ or ‘this is the stink of armpits’.9 (It’s both: they’re smelling an aromatic molecule present in both runny cheese and bodily crevices.)

This process of sensing what you expect to sense is widespread. In the cheese study, it was visceral. In the case of the electron charge or Avogadro’s number, it was cerebral. In both cases, it seems to have been unconscious.

But we can also filter new information consciously, because we don’t want it to spoil our day. Back in the first chapter, we encountered students who’d pay not to have their blood tested for herpes and investors who avoided checking their stock portfolios when the news might be bad. Here’s another example – a study published in 1967, which asked undergraduates to listen to tape-recorded speeches and requested that they ‘judge the persuasiveness and sincerity of talks prepared by high school juniors and seniors . . . After each talk you will be given a rating sheet to rate the persuasiveness and sincerity of the speech.’

However, there was a catch. The talks were clouded with annoying static. The experimental subjects were told: ‘Since the talks were recorded on a small portable tape recorder there is considerable electrical interference. The interference can be “adjusted out” by pressing and then immediately releasing the control button. Use of the control several times in a row reduces somewhat the static and other interference noise.’10

Fine. Of course, as you can guess by now, the experiment involved some deception. Some of the undergraduates were committed Christians, and others were committed smokers. One of the talks was based on an old-school atheistic pamphlet titled Christianity Is Evil, another relied on ‘an authoritative refutation of the arguments linking smoking to lung cancer’, and a third spoke with similar authority about the fact that smoking did cause lung cancer.

As we’ve seen, all of us are capable of metaphorically filtering the information that comes our way, discarding some ideas and clinging on to others. In this experiment, the filter was more literal: static that obscured the messages the experimental subjects were supposed to listen to and evaluate. Pressing a button could remove the crackle and hiss – but not everyone enthusiastically mashed the button for every speech. It may not surprise you to hear that the Christians were content to leave the militant atheism behind a reassuring fog of static. Smokers pressed the button repeatedly to listen to the talk explaining that their habit was perfectly safe, while allowing the static to float back in when a different taped message told them unwelcome news.

One of the reasons facts don’t always change our minds is that we are keen to avoid uncomfortable truths. These days, of course, we don’t need to mess around with a static-reducing button. On social media we can choose who to follow and who to block. A vast range of cable channels, podcasts and streaming video lets us decide what to watch and what to ignore. We have more such choices than ever before, and you can bet that we’ll use them.

If you do have to absorb unwelcome facts, not to worry: you can always selectively misremember them. That was the conclusion of Baruch Fischhoff and Ruth Beyth, two psychologists who ran an elegant experiment in 1972. They conducted a survey in which they asked male and female students for predictions about Richard Nixon’s imminent presidential visit to China and the Soviet Union. How likely was it that Nixon and Mao Zedong would meet? What were the chances that the US would grant diplomatic recognition to China? Would the US and USSR announce a joint space programme?

Fischhoff and Beyth wanted to know how people would later remember their forecasts. They’d given their subjects every chance, since the forecasts were both specific and written down. (Usually our forecasts are rather vague prognostications in the middle of conversation. We rarely commit them to writing.) So one might have hoped for accuracy. But no – the subjects flattered themselves hopelessly. If they put some event at a 25 per cent likelihood, and then it happened, they might then remember they’d called it as a 50/50 proposition. If a subject had put a 60 per cent probability on an event which later failed to happen, she might later recall that she’d forecast a 30 per cent probability. The Fischhoff-Beyth paper was titled ‘I knew it would happen’.

It’s yet another striking illustration of how our emotions lead us to filter the most straightforward information – our own memory of an estimate we made not long ago, and went to the trouble of committing to paper.11 In some ways, this shows a remarkable mental flexibility. But rather than admit error and learn from it, Fischhoff and Beyth’s subjects were changing their own recollections to ensure that no painful reckoning with reality was required. As we’ve seen: admitting you’re wrong, then changing your view, is not an easy thing to do.

Of course, Irving Fisher wouldn’t have had to change his mind if he’d been right all along. Perhaps his real downfall was not the failure to adjust, but the failure to forecast accurately in the first place? Perhaps. It is certainly preferable to be right first time than to learn through painful experience. But the best studies we have of forecasting ability suggest that being right first time isn’t easy either.

In 1987, a young Canadian-born psychologist, Philip Tetlock, planted a time bomb under the forecasting industry that would not explode for eighteen years. Tetlock had been part of a rather grand project in which social scientists had been tasked with preventing nuclear war between the US and the USSR. As part of that project, he had interviewed many top experts to get their sense of what was happening in the Soviet Union, how the Soviets might respond to Ronald Reagan’s hawkish stance, what might happen next, and why.

But he found himself frustrated: frustrated by the fact that the leading political scientists, Sovietologists, historians and policy wonks had such contradictory views about what might happen next; frustrated by their refusal to change their minds in the face of contradictory evidence; and frustrated by the many ways in which even failed forecasts could be justified. Some predicted disaster, but were happy to rationalise the lack of catastrophe: ‘I was nearly right but fortunately it was Gorbachev rather than some neo-Stalinist who took over the reins.’ ‘I made the right mistake: far more dangerous to underestimate the Soviet threat than overestimate it.’ Or, of course, the get-out for all failed stock market forecasts, ‘Only my timing was wrong.’

Tetlock’s response was patient, painstaking and quietly brilliant. Following in the footsteps of Fischhoff and Beyth, but with more detail and on a much larger scale, he began to collect forecasts from almost three hundred experts, eventually accumulating 27,500 predictions. The main focus of the questions he asked was on politics and geopolitics, with a few from other areas such as economics thrown in. Tetlock sought clearly defined questions, enabling him with the benefit of hindsight to pronounce each forecast right or wrong. Then he simply waited while the results rolled in – for eighteen years.

Tetlock published his conclusions in 2005, in a subtle and scholarly book, Expert Political Judgment. He found that his experts were terrible forecasters. This was true in both the simple sense that the forecasts failed to materialise and in the deeper sense that the experts had little idea of how confident they should be in making forecasts in different contexts. It is easier to make forecasts about the territorial integrity of Canada than about the territorial integrity of Syria but, beyond the most obvious cases, the experts Tetlock consulted failed to distinguish the Canadas from the Syrias. Tetlock’s experts, like Fischhoff and Beyth’s amateurs, also dramatically misremembered their own forecasts, recalling some of their failures as things they’d been right about all along.12

Adding to the appeal of this tale of expert hubris, Tetlock found that the most famous experts made even less accurate forecasts than those outside the media spotlight. Other than that, the humiliation was evenly distributed. Regardless of political ideology, profession and academic training, experts failed to see into the future.

Most people, hearing about Tetlock’s research, simply conclude that either the world is too complex to forecast, or that experts are too stupid to forecast it, or both. But there was one person who kept faith in the possibility that even for intractable human questions of macroeconomics and geopolitics, a forecasting approach might exist that would bear fruit. That person was Philip Tetlock himself.

In 2013, on the auspicious date of 1 April, I received an email from Tetlock inviting me to join what he described as ‘a major new research programme funded in part by Intelligence Advanced Research Projects Activity, an agency within the US intelligence community’.

The core of the programme, which had been running since 2011, was a collection of quantifiable forecasts much like Tetlock’s long-running study. The forecasts would be of economic and geopolitical events, ‘real and pressing matters of the sort that concern the intelligence community – whether Greece will default, whether there will be a military strike on Iran, etc’. These forecasts took the form of a tournament with thousands of contestants; the tournament ran for four annual seasons.

‘You would simply log on to a website,’ Tetlock’s email continued, ‘give your best judgment about matters you may be following anyway, and update that judgment if and when you feel it should be. When time passes and forecasts are judged, you could compare your results with those of others.’

I did not participate. I told myself I was too busy; perhaps I was too much of a coward as well. But the truth is that I did not participate because, largely thanks to Tetlock’s work, I had concluded that the forecasting task was impossible.

Still, more than 20,000 people embraced the idea. Some could reasonably be described as having professional standing, with experience in intelligence analysis, think-tanks or academia. Others were pure amateurs. Tetlock and two other psychologists, Barbara Mellers (Mellers and Tetlock are married) and Don Moore, ran experiments with the co-operation of this army of volunteers. Some were given training in basic statistical techniques (more on this in a moment); some were assembled into teams; some were given information about other forecasts; while others operated in isolation. The entire exercise was given the name of the Good Judgment Project, and the aim was to find better ways to see into the future.

This vast project has produced a number of insights, but the most striking is that there was a select group of people whose forecasts, while by no means perfect, were vastly better than the dart-throwing-chimp standard reached by the typical prognosticator. What is more, they got better over time rather than fading away as their luck changed. Tetlock, with an uncharacteristic touch of hyperbole, called them ‘superforecasters’.

The cynics were too hasty: it is possible to see into the future after all.

What makes a superforecaster? Not subject-matter expertise: professors were no better than well-informed amateurs. Nor was it a matter of intelligence, otherwise Irving Fisher would have been just fine. But there were a few common traits among the better forecasters.

First, encouragingly for us nerds, it did help to have some training – of a particular kind. Just an hour of training in basic statistics improved the performance of forecasters by helping them turn their expertise about the world into a sensible probabilistic forecast, such as ‘the chance that a woman will be elected President of the US within the next ten years is 25 per cent’. The tip that seemed to help most was to encourage them to focus on something called ‘base rates’.13

What on earth are base rates? Well, imagine that you find yourself at a wedding, sitting at one of the back tables with the drunk schoolfriends of the groom or the disgruntled ex-boyfriend of the bride. (Yes, that sort of wedding.) At a tedious moment during one of the speeches, the conversation at your table turns to the distasteful question: will these two actually make it? Will the marriage last or is the relationship doomed to divorce?

The instinctive starting point is to think about the couple. It’s always hard to imagine divorce in the middle of the romance of a wedding day (although sharing a whisky with the bride’s ex-boyfriend may shake you out of that rosy glow) but you’d naturally ponder questions such as: ‘Do they seem happy and committed to each other?’; ‘Have I ever seen them argue?’; and ‘Have they split up and got back together three times already?’ In other words, we make a forecast with the facts that are in front of our nose.

But it is a better idea to zoom out and find one very straightforward statistic*: in general, how many marriages end in divorce? This number is known as the ‘base rate’. Unless you know whether the base rate is 5 per cent or 50 per cent, all the gossip you’re getting from the grumpy ex doesn’t fit into any useful framework.

The importance of the base rate was made famous by the psychologist Daniel Kahneman, who coined the phrase ‘the outside view and the inside view’. The inside view means looking at the specific case in front of you: this couple. The outside view requires you to look at a more general ‘comparison class’ of cases – here, the comparison class is all married couples. (The outside view needn’t be statistical, but it often will be.)

Ideally, a decision-maker or a forecaster will combine the outside view and the inside view – or, similarly, statistics plus personal experience. But it’s much better to start with the statistical view, the outside view, and then modify it in the light of personal experience than it is to go the other way around. If you start with the inside view you have no real frame of reference, no sense of scale – and can easily come up with a probability that is ten times too large, or ten times too small.
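
As a minimal sketch of that order of operations (and nothing more rigorous than that), here is the outside-view-first habit written out in code, using the 38 per cent figure from the footnote as a base rate and some frankly made-up inside-view adjustments.

```python
# Invented numbers throughout: start from the base rate (the outside view),
# then nudge it with case-specific impressions (the inside view), rather than
# conjuring a probability from the impressions alone.
base_rate = 0.38  # share of marriages ending in divorce (one possible base rate)

# Case-specific impressions, expressed as multiplicative nudges to the odds.
# Everything here is a judgement call, not data.
adjustments = {
    "seem happy and committed": 0.8,           # lowers the odds of divorce a little
    "have split up three times already": 1.6,  # raises them
}

odds = base_rate / (1 - base_rate)  # convert the probability to odds
for impression, factor in adjustments.items():
    odds *= factor
probability = odds / (1 + odds)     # and back to a probability

print(f"outside view alone:            {base_rate:.0%}")
print(f"after inside-view adjustments: {probability:.0%}")
```

The particular numbers are guesses; the point is the sequence. Anchor on the comparison class first, then let the wedding-table gossip nudge the estimate up or down.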

Second, keeping score was important. As Tetlock’s intellectual predecessors Fischhoff and Beyth had demonstrated, we find it challenging to do something as simple as remembering whether our earlier forecasts were right or wrong.

Third, superforecasters tended to update their forecasts frequently as new information emerged, which suggests that a receptiveness to new evidence was important. This willingness to adjust predictions is correlated with making better predictions in the first place: it wasn’t just that the superforecasters beat the others because they were news junkies with too much time on their hands, prospering by endlessly tweaking their forecasts with each new headline. Even if the tournament rules had demanded a one-shot forecast, the superforecasters would have come top of the heap.

Which points to the fourth and perhaps most crucial element: superforecasting is a matter of having an open-minded personality. The superforecasters are what psychologists call ‘actively open-minded thinkers’ – people who don’t cling too tightly to a single approach, are comfortable abandoning an old view in the light of fresh evidence or new arguments, and embrace disagreements with others as an opportunity to learn. ‘For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded,’ wrote Philip Tetlock after the study had been completed. ‘It would be facile to reduce superforecasting to a bumper-sticker slogan, but if I had to, that would be it.’14

And if even that is too long for the bumper sticker, what about this: superforecasting means being willing to change your mind.

The unfortunate Irving Fisher had struggled to change his mind. Not everyone had the same difficulty. The contrast with John Maynard Keynes is striking, despite the many similarities the two men shared. Keynes, like Fisher, was a colossal figure in economics. Like Fisher, he was a popular author, a regular newspaper commentator, a friend of powerful politicians, and a charismatic speaker. (After witnessing Keynes giving a speech, the Canadian diplomat Douglas LePan was moved to write, ‘I am spellbound. This is the most beautiful creature I have ever listened to. Does he belong to our species? Or is he from some other order?’)15 And like Fisher, Keynes was an enthusiastic participant in financial markets – founding an early hedge fund, dabbling in currency speculation, and managing a large portfolio on behalf of King’s College, Cambridge. His ultimate fate, however, was very different. The similarities and the contrasts between the two men are instructive.

Unlike Fisher, who had had to scramble for his success, Keynes was the ultimate insider. He was educated at Eton College – just like Britain’s first Prime Minister, and nineteen others since. Like his father, he became a senior academic: a Fellow of King’s College, the most spectacular of all the Cambridge colleges. His job during the First World War was managing both debt and currency on behalf of the British Empire; he’d barely turned thirty. He knew everyone. He whispered in the ear of Prime Ministers. He had the inside track on whatever was going on in the British economy – the Bank of England would even call him to give him advance notice of interest rate movements.

But this child of the British establishment was a very different person to his American counterpart. He loved fine wines and rich food; he gambled at Monte Carlo. His sex life was more like that of a 1970s pop star than a 1900s economist: bisexual, polyamorous, eventually settling down not with his childhood sweetheart but with a Russian ballerina, Lydia Lopokova. One of Keynes’s ex-boyfriends was the best man at their wedding.

He was adventurous in other ways, too. In 1918, for example, Keynes worked at the British Treasury. The First World War was still raging. The German army was camped outside Paris, shelling the city. But Keynes caught wind of the fact that, in Paris, the vast collection assembled by the great French impressionist Edgar Degas, who had died a few months earlier, was about to be auctioned: works by France’s greatest nineteenth-century painters, Manet, Ingres and Delacroix.16

And so Keynes launched an insane adventure. First, he persuaded the British Treasury, which was four years into fighting the most devastating war the planet had yet seen, to put together a £20,000 fund for purchasing art – millions in today’s money. There was certainly a logic to the idea that it was a buyer’s market, but you’ve got to be pretty persuasive to free up funds from a wartime treasury to splurge on nineteenth-century French art.

Then, escorted by destroyers and a silver airship, Keynes crossed the Channel to France with the director of London’s National Gallery, who was wearing a fake moustache so that nobody would recognise him. With the German artillery booming beyond the horizon, they showed up at the auction and cleared out the Degas collection. The National Gallery got twenty-seven masterpieces at rock-bottom prices. Keynes even bought a few for himself.

After escaping back across the Channel, and exhausted after his adventures in Paris, Keynes showed up at the door of his friend Vanessa Bell and told her that he’d left a Cézanne outside in the hedge – could he please have a hand carrying it in? (Bell was the sister of the author Virginia Woolf and the lover of Keynes’s ex-boyfriend Duncan Grant, although she was married to someone else . . . Keynes’s social circle was complicated.) Keynes had got himself a bargain: these days a good Cézanne is worth a lot more than anything the National Gallery dared to purchase at the auction. But what Irving Fisher would have made of it all, I do not know.

At the end of the war, Keynes represented the British Treasury at the peace conference in Versailles. (He was disgusted at the outcome – and subsequent events proved him right.) Then, with currencies free-floating and volatile, Keynes set up what some historians describe as the first hedge fund to speculate on their movements. He raised capital from rich friends, and from his own father, to whom he made the not entirely reassuring comment, ‘Win or lose, this high-stakes gambling amuses me!’

Initially Keynes made money fast – over £25,000, even more than the art fund he’d wheedled out of the Treasury. His bet, in brief, was that the currencies of France, Italy and Germany would suffer in a bout of post-war inflation. In this he was broadly correct. Yet there’s an old saying, often attributed (without evidence) to Keynes himself: the market can stay irrational longer than you can stay solvent. A brief surge of optimism about Germany’s prospects wiped out Keynes’s fund in 1920. Undaunted, he went back to his investors. ‘I am not in a position to risk any capital myself, having quite exhausted my resources,’ he noted. But the spellbinding Keynes persuaded others to invest and his fund was back in profit by 1922.

One of Keynes’s next investment projects – he had several – concerned the portfolio of King’s College, Cambridge. Five centuries old, the college had long-standing rules on its investment policy, leaving it reliant on agricultural rents and very conservative investments such as railway bonds and government gilts. In 1921 the ever-persuasive Keynes convinced the college to change these rules to give him complete discretion over a significant slice of the college portfolio.

Keynes’s strategy for this money was top-down. He would forecast booms and recessions both in the UK and abroad, and invest in shares and commodities accordingly, moving across different sectors and countries depending on the macro-economic outlook.

Such an approach seemed to make sense. Keynes was the leading economic theorist in the country. He was receiving tips from the Bank of England. If anyone could call the ebb and flow of the British economy, it was John Maynard Keynes.

If.

Keynes, like Fisher, did not predict the great crash of 1929. Unlike Fisher, though, he recovered. Keynes died a millionaire, his reputation enhanced by his financial acumen. The reason is simple: Keynes, unlike Fisher, changed his mind, and his investment strategy.

Keynes had one advantage over Fisher: his track record as an investor had been painfully mixed. Yes, he had scored a remarkable coup in the art auction of 1918, and made a small fortune in the currency markets in 1922. But he had been wiped out in 1920, and his clever-seeming approach with the King’s College portfolio wasn’t working either. Over the course of the 1920s, Keynes’s attempts to forecast the business cycle had led him to trail the market as a whole by about 20 per cent. That is not a disaster, but it is certainly an indication that all is not well.

None of this helped Keynes see the great crash of 1929, but it did help him react to it. He had already been pondering his limitations as an investor, and wondering whether a different approach might pay off. When the crash hit, Keynes shrugged, and adjusted.

By the early 1930s, Keynes had abandoned business-cycle forecasting entirely. The greatest economist in the world had decided that he just couldn’t do it well enough to make money. It is a striking instance of humility from a man famous for his self-confidence. But Keynes had looked at the evidence and done something unusual: he’d changed his mind.

He moved instead to an investment strategy that required no great macroeconomic insight. Instead, he explained, ‘As time goes on, I get more and more convinced that the right method in investment is to put fairly large sums into enterprises which one thinks one knows something about and in the management of which one thoroughly believes.’ Forget what the economy is doing; just find well-managed companies, buy some shares, and don’t try to be too clever. And if that approach sounds familiar, it’s most famously associated with Warren Buffett, the world’s richest investor – and a man who loves to quote John Maynard Keynes.

Keynes is rightly viewed today as a successful investor. At King’s College, he recovered from the poor performance of the early years. When two financial economists, David Chambers and Elroy Dimson, recently studied Keynes’s track record with the King’s College portfolio, they found it to be excellent. Keynes secured high returns with modest risks, and outperformed the stock market as a whole by an average of six percentage points a year over a quarter of a century. That’s an impressive reward for being able to change your mind.17
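
To get a feel for what a six-percentage-point edge means when compounded over a quarter of a century, here is a quick illustrative calculation with invented return figures (not the actual numbers from the Chambers and Dimson study).

```python
# Illustrative compounding only. Assume the market returns 5 per cent a year
# and a portfolio with a six-percentage-point edge returns 11 per cent.
market_return = 0.05
edge_return = market_return + 0.06
years = 25

market_pot = 100 * (1 + market_return) ** years
edge_pot = 100 * (1 + edge_return) ** years

print(f"£100 tracking the market for {years} years: £{market_pot:,.0f}")
print(f"£100 with a 6pp annual edge for {years} years: £{edge_pot:,.0f}")
print(f"ratio: {edge_pot / market_pot:.1f}x")
```

The exact multiple depends on the market return you assume, but the conclusion does not: a modest-sounding annual edge, sustained for decades, compounds into roughly four times as much money.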

It all sounds so simple: things are going badly, so do something different. Why, then, did Irving Fisher struggle to adapt?

Fisher’s first problem, ironically, was his successful track record. He was seriously wealthy by the end of the 1920s, having prospered in almost every endeavour he had attempted. As an investor, he had correctly predicted the productivity boom of the 1920s and correctly judged that the stock market would soar, and his leveraged bets on those judgements had paid off handsomely. Unlike Keynes, Fisher had received very little evidence of his own fallibility. It must have been hard for him to take in the scale of the financial bloodbath. It was all too tempting to write it off as a brief spasm of lunacy, which is what Fisher did.

In contrast, when the market crashed, Keynes was able to see it – and himself – for what it was. He’d been in crashes before, and lost heavily before. He was like a physicist who’d been forewarned that Robert Millikan’s research was flawed, so his estimates shouldn’t be taken too seriously; or perhaps like an experimental subject sniffing a test-tube after being told ‘this might be cheese, or it might be armpits, so think carefully’.

Fisher was vulnerable in a second way. He was constantly writing about his investment ideas, pinning his reputation to the claim that the stock market was on the up and up. There is a lot of vague prophecy in the forecasting business, so such public commitments are admirably honest. They are also dangerous. It wasn’t the concreteness of the predictions that was the problem. As we’ve seen, superforecasters tend to keep a careful record of their predictions. How else can they learn from their mistakes? No: it was the high public profile that made it harder for Fisher to change his mind.

One study of this, conducted by psychologists Morton Deutsch and Harold Gerard in 1955, asked college students to estimate the lengths of lines – a modification of the experiments conducted by Solomon Asch a few years earlier, described in the sixth chapter. Some of the students did not write their estimates down. Others wrote their estimates down on an erasable pad, before erasing the result. Still others wrote their estimates down in permanent marker. As new information emerged, the students who had made this more public commitment were the least willing to change their minds.18

‘Kurt Lewin noticed [this effect] in the 1930s,’ says Philip Tetlock, referring to one of the founders of modern psychology. ‘Making public commitments “freezes” attitudes in place. So saying something dumb makes you a bit dumber. It becomes harder to correct yourself.’19

And Fisher’s commitment could hardly have been more public. Just over a week before the Wall Street crash began, he was reported by the New York Times as saying, ‘Stock prices have reached what looks like a permanently high plateau.’ How do you back away from that?

Fisher’s third problem – perhaps the deepest – was his belief that, in the end, the future was knowable. ‘The sagacious businessman is constantly forecasting,’ he once wrote. Maybe. But contrast that with John Maynard Keynes’s famous view about long-term forecasts: ‘About these matters there is no scientific basis on which to form any calculable probability whatever. We simply do not know.’

Fisher, a man who was happy to specify the perfect angle for the out-turn of the foot, admire the rigour of the ‘clo’ unit of warming, and estimate the productivity gain from Prohibition, believed that with a sufficiently powerful statistical lens any problem would yield to the man of science. The statistical lens is indeed powerful. Still, I hope that I have convinced you that for any problem, it takes more than mere numbers to make the world add up.

Poor Irving Fisher had believed himself to be a man of logic and reason. He was a campaigner for education reform and the proven benefits of a vegetarian diet, and a student of ‘the science of wealth’. And yet he became the most famous financial basket-case in the country.

He kept thinking, and working, producing an incisive account of why the Depression had been so severe – including a painful reckoning with the effect of debt on the economy. But while his economic ideas are still respected today, he became a marginalised figure. He was deep in debt to the taxman and to his brokers, and towards the end of his life, a widower living alone in modest circumstances, he became an easy target for scam artists: he was always looking for the big financial break that would restore his fortune. The mansion was long gone. He avoided bankruptcy, and perhaps even prison, because his late wife’s sister covered his debts to the value of tens of millions of dollars in today’s terms. It was a kindness, but for the proud Professor Fisher it must have been the ultimate humiliation.

The economic historian Sylvia Nasar wrote of Fisher that ‘His optimism, overconfidence and stubbornness betrayed him.’20 Keynes had plenty of confidence too, but he had also learned the hard way that there are certain facts about the world that do not easily yield to logic. Recall his comment to his father – ‘this high-stakes gambling amuses me’. The Monte Carlo gambler knew, all along, that while investing was a fascinating game, it was a game nonetheless, and one should not take an unlucky throw of the dice too much to heart. When his early investment ideas failed, he tried something else. Keynes was able to change his mind; Fisher, alas, could not.

Fisher and Keynes died within a year of each other, not long after the end of the Second World War. Fisher was a much-diminished figure; Keynes was the most influential economist on the planet, fresh from shaping the World Bank, the IMF and the entire global financial system at the Bretton Woods conference in 1944.

Late in his life, Keynes reflected, ‘My only regret is that I have not drunk more champagne in my life.’ But he is remembered far more for words that he probably never said. Nevertheless, he lived by them: ‘When my information changes, I alter my conclusions. What do you do, sir?’

If only he had taught that lesson to Irving Fisher.

Fisher and Keynes were equally expert, and they had the same statistical information at their fingertips – data they themselves had done much to collect. Just as with Abraham Bredius, the art scholar so cruelly tricked by the forger Han van Meegeren, their fates were determined not by their expertise but by their emotions.

This book has argued that it is possible to gather and to analyse numbers in ways that help us understand the world. But it has also argued that very often we make mistakes not because the data aren’t available, but because we refuse to accept what they are telling us. For Irving Fisher, and for many others, the refusal to accept the data was rooted in a refusal to acknowledge that the world had changed.

One of Fisher’s rivals, an entrepreneurial forecaster named Roger Babson, explained (not without sympathy) that while Fisher was ‘one of the greatest economists in the world today and a most useful and unselfish citizen’, he had failed as a forecaster because ‘he thinks the world is ruled by figures instead of feelings’.21

I hope that this book has persuaded you that it is ruled by both.

___________

* I’ll spare you my efforts to define these physical constants. For our purposes what matters is that they are hard to measure precisely, and that each attempt to improve the accuracy of these measurements seems to have been systematically swayed by previous attempts.

* It’s naughty of me to call this a ‘straightforward’ statistic. In the UK, according to the Office for National Statistics (Statistical Release, 29 November 2019), 22 per cent of 1965 marriages had ended in divorce by 1985. That figure has risen over time: 38 per cent of 1995 marriages had ended in divorce by 2015. There is now evidence that the divorce rate is falling again – but it is obviously too early to say how many recent marriages will last twenty years. Evidently it’s a matter of judgement – and the available data – as to which base rate you think is relevant. All UK marriages? All recent marriages? All marriages between people of a certain age, or education level? It’s not straightforward at all, if I am honest. But it is better to try to find a relevant base rate and reason from there, than to pull numbers out of your brain without any context.