Decision making
Daniel Ellsberg (1931–)
1921 US economist Frank Knight explains that “risk” can be quantified and “uncertainty” cannot.
1954 In The Foundations of Statistics, US mathematician L. J. Savage tries to show how probabilities can be assigned to unknown future events.
From 1970s Behavioral economics uses experiments to study behavior under conditions of uncertainty.
1989 Michael Smithson proposes a “taxonomy” of ignorance.
2007 Nassim Nicholas Taleb’s The Black Swan discusses the problem of rare, unforeseen events.
By the 1960s, mainstream economics had settled on a set of principles for understanding people’s decision making. Human beings are rational, calculating individuals: when confronted with different options and an uncertain future, they assign a probability to each possible outcome and weigh their choices accordingly. They seek to maximize their “expected utility” (the amount of satisfaction they expect), based on their beliefs about the probability of different future outcomes, and they opt for the choice with the highest expected utility.
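As a rough illustration of the expected-utility rule described above, the sketch below ranks two options by probability-weighted utility. The payoffs, probabilities, and the logarithmic utility function are illustrative assumptions, not part of the theory itself.

```python
import math

def expected_utility(lottery, utility=math.log):
    """Probability-weighted sum of the utilities of each possible payoff."""
    return sum(prob * utility(payoff) for prob, payoff in lottery)

# Two hypothetical options, each a list of (probability, payoff) pairs.
safe  = [(1.0, 100)]                 # $100 for certain
risky = [(0.5, 220), (0.5, 40)]      # a coin flip between $220 and $40

# The theory says a rational agent simply picks the option with the
# higher expected utility; with log utility the certain $100 wins here.
options = {"safe": safe, "risky": risky}
best = max(options, key=lambda name: expected_utility(options[name]))
print(best)
```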
But this set of ideas was challenged by results showing that, even under controlled experimental conditions, people do not behave as the theory predicts. One of the most important of these challenges is the Ellsberg paradox, set out by US economist Daniel Ellsberg in 1961 but drawing on an idea originally described by John Maynard Keynes in the 1930s.
Ellsberg described a thought experiment in which a cash prize was offered if a ball of a particular color was drawn from an imaginary urn. The bets made by the experiment’s participants showed that people make a reasoned choice when they are given enough information to assess the probability, and therefore the risk, of each outcome. However, their behavior changes when an outcome seems ambiguous, and it is this change that departs from expected utility theory and creates the paradox. People prefer to know more about the uncertainties they face, rather than less. In the words of former US Defense Secretary Donald Rumsfeld (1932–), people prefer the “known unknowns” to the “unknown unknowns.” In the best-known version of the experiment, the urn holds 30 red balls and 60 balls that are black or yellow in an unknown proportion: most people prefer to bet on red rather than black, yet also prefer to bet on “black or yellow” rather than “red or yellow,” a pair of choices that no single assignment of probabilities can justify.

Ellsberg’s result has been reproduced in many real experiments since he published his paper. The preference for known risks over ambiguous ones has become known as “ambiguity aversion,” and the unquantifiable uncertainty behind it is sometimes called “Knightian uncertainty,” after the US economist Frank Knight. In their efforts to avoid the “unknown unknowns,” people may act inconsistently with previous, more logical choices, putting questions of probability aside when they make their decision.
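A short sketch can make the inconsistency explicit. Only the 30 red and 60 black-or-yellow balls come from the urn described above; the prize amount, the grid of candidate probabilities, and the function names are illustrative assumptions.

```python
# Minimal sketch of the three-color Ellsberg urn: 30 red balls plus 60 balls
# that are black or yellow in an unknown proportion. Illustrative values only.
PRIZE = 100.0          # assumed cash prize
P_RED = 30 / 90        # the only probability the setup actually pins down

def expected_payoff(bet_colors, p_black):
    """Expected prize from betting on bet_colors, assuming P(black) = p_black."""
    p = {"red": P_RED, "black": p_black, "yellow": 1 - P_RED - p_black}
    return PRIZE * sum(p[color] for color in bet_colors)

# The typical pattern of choices: bet A (red) is preferred to bet B (black),
# yet bet D (black or yellow) is preferred to bet C (red or yellow).
# Search for a single belief about P(black) that makes both preferences rational.
for i in range(0, 67):
    p_black = i / 100
    prefers_a = expected_payoff({"red"}, p_black) > expected_payoff({"black"}, p_black)
    prefers_d = expected_payoff({"black", "yellow"}, p_black) > expected_payoff({"red", "yellow"}, p_black)
    if prefers_a and prefers_d:
        print("consistent belief found: P(black) =", p_black)
        break
else:
    # Preferring A requires P(black) < 1/3, while preferring D requires
    # P(black) > 1/3, so no single probability assignment justifies both.
    print("no single value of P(black) rationalizes both preferences")
```

The search is only there to show that the two typical choices pull the implied probability of black in opposite directions, which is exactly the inconsistency Ellsberg identified.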
Ellsberg’s paradox has proved controversial. Some economists claim that it can safely be contained within conventional theory, and that experimental conditions do not properly reproduce people’s behavior when faced with real-life ambiguity. However, the financial crisis of 2008 has provoked fresh interest in the problem of ambiguity. People want to know more about the unknown, unquantifiable risks that expected utility theory cannot account for.
Born in 1931, Daniel Ellsberg studied economics at Harvard University and joined the US Marine Corps in 1954. In 1959, he became a strategic analyst at the RAND Corporation, later advising the Pentagon and the White House. He received his PhD in 1962 for a dissertation elaborating the paradox he had published the previous year. Ellsberg, then working with top security clearance, became disillusioned with the Vietnam War. In 1971, he leaked top-secret reports detailing the Pentagon’s belief that the war could not be won, before handing himself over to the authorities. His trial collapsed when it was revealed that White House agents had broken into his psychiatrist’s office and illegally wiretapped his conversations.
Key works
1961 Risk, Ambiguity, and the Savage Axioms
2001 Risk, Ambiguity, and Decision