Chapter 2
Expectations and Fair Values

Let's say that you are offered the following bet: you pay $1, then a coin is flipped. If the coin comes up tails you lose your money. On the other hand, if it comes up heads, you get back your dollar along with a 50 cents profit. Would you take it?

Your gut feeling probably tells you that this bet is unfair and you should not take it. As a matter of fact, in the long run, it is likely that you will lose more money than you could possibly make (because for every dollar you lose, you will only make a profit of 50 cents, and winning and losing have the same probability). The concept of mathematical expectation allows us to generalize this observation to more complex problems and formally define what a fair game is.

2.1 Random Variables

Consider an experiment with numerically valued outcomes x1, x2, …, xn. We call the outcome of this type of experiment a random variable, and denote it with an uppercase letter such as X. In the case of games and bets, two related types of numerical outcomes arise often. First, we consider the payout of a bet, which we briefly discussed in the previous chapter.

The payout of a bet is the amount of money that is awarded to the player under each possible outcome of a game.

The payout is all about what the player receives after the game is played, and it does not account for the amount of money that a player needs to pay to enter it. An alternative outcome that addresses this issue is the profit of a bet:

The profit of a bet is the net change in the player's fortune that results from each of the possible outcomes of the game and is defined as the payout minus the cost of entry.

Profit = Payout − Price of entry

Note that, while all payouts are typically nonnegative, profits can be either positive or negative.

For example, consider the bet that was offered to you in the beginning of this chapter. We can define the random variable

X = payout of the bet

As we discussed earlier, this random variable represents how much money a player receives after playing the game. Therefore, the payout has only two possible outcomes, $0 and $1.50, with associated probabilities 1/2 and 1/2. Alternatively, we could define the random variable

Y = profit of the bet = X − 1

which represents the net gain for a player. Since the price of entry to the game is $1, the random variable Y has possible outcomes −$1 (if the player loses the game) and $0.50 (if the player wins the game), with associated probabilities 1/2 and 1/2.

2.2 Expected Values

To evaluate a bet, we would like to find a way to summarize the different outcomes and probabilities into a single number. The expectation (or expected value) of a random variable allows us to do just that.

The expectation of a random variable X with outcomes x1, x2, …, xn occurring with probabilities p1, p2, …, pn is a weighted average of the outcomes, with the weights given by the probability of each outcome:

E(X) = x1 p1 + x2 p2 + ⋯ + xn pn

For example, the expected payout of our initial wager is

E(X) = $0 × 1/2 + $1.50 × 1/2 = $0.75

On the other hand, the expected profit from that bet is E(Y) = (−$1) × 1/2 + $0.50 × 1/2 = −$0.25.
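As a quick check, the two expectations above can be computed with a few lines of code. The book's own examples use R; the sketch below is in Python, and the function and variable names are my own:

```python
# Expectation of a discrete random variable: weighted average of the
# outcomes, with weights given by their probabilities.

def expectation(outcomes, probs):
    return sum(x * p for x, p in zip(outcomes, probs))

# Payout X: $0.00 on tails, $1.50 on heads, each with probability 1/2.
e_payout = expectation([0.0, 1.50], [0.5, 0.5])

# Profit Y = X - 1 (the bet costs $1 to enter): -$1.00 or +$0.50.
e_profit = expectation([-1.0, 0.50], [0.5, 0.5])

print(e_payout, e_profit)  # 0.75 and -0.25
```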

We can think about the expected value as the long-run “average” or “representative” outcome for the experiment. For example, the fact that E(X) = $0.75 means that, if you play the game many times, for every dollar you pay, you will get back from the house about 75 cents (or, alternatively, that if you start with $1000, you will probably end up with only about $750 at the end of the day). Similarly, the fact that E(Y) = −$0.25 means that for every $1000 you bet you expect to lose around $250 (you lose because the expected value is negative). This interpretation is again justified by the law of large numbers:

Law of Large Numbers for Expectations (Law of Averages)

Let X̄n represent the average outcome of n repetitions of a random variable X with expectation E(X). Then X̄n approaches E(X) as n grows.

We can visualize how the running average of the profit associated with our original bet approaches the expected value by simulating the outcome of 5000 such bets and plotting the running average (see Figure 2.1).


Figure 2.1 Running profits from a wager that costs $1 to join and pays nothing if a coin comes up tails and $1.50 if the coin comes up heads (solid line). The gray horizontal line corresponds to the expected profit.
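The simulation behind Figure 2.1 can be sketched as follows (in Python rather than the book's R, and tracking only the running average rather than producing the plot; variable names are my own):

```python
# Simulate 5000 plays of the coin-flip bet and track the running average
# profit, which the law of large numbers says should settle near
# E(Y) = -0.25.
import random

random.seed(42)  # for reproducibility

n = 5000
# Profit is +$0.50 on heads (probability 1/2) and -$1.00 on tails.
profits = [0.50 if random.random() < 0.5 else -1.00 for _ in range(n)]

running_avg = []
total = 0.0
for i, y in enumerate(profits, start=1):
    total += y
    running_avg.append(total / i)

print(running_avg[-1])  # close to -0.25
```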

The expectation of a random variable has some nifty properties that will be useful in the future. In particular,

If X and Y are random variables and a, b, and c are constants (non-random numbers), then

E(aX + bY + c) = a E(X) + b E(Y) + c

To illustrate this formula, note that for the random variables X and Y we defined in the context of our original bet, we have Y = X − 1 (recall our definition of profit as payout minus price of entry). Hence, in this case, we should have E(Y) = E(X) − 1, a result that you can easily verify yourself from the facts that E(X) = $0.75 and E(Y) = −$0.25.

2.3 Fair Value of a Bet

We could turn the previous calculation on its head by asking how much money you would be willing to pay to enter a wager. That is, suppose that the bet we proposed in the beginning of this chapter reads instead: you pay me $c, then I flip a coin. If the coin comes up tails, I get to keep your money. On the other hand, if it comes up heads, I give you back the price of the bet $c along with a 50-cent profit. What is the highest value of c that you would be willing to pay? We call this value of c the fair value of the bet.

Since you would like to make money in the long run (or, at least, not lose money), you would probably like to have a nonnegative expected profit, that is, E(W) ≥ 0, where W is the random variable associated with the profit generated by the bet described earlier. Consequently, the maximum price you would be willing to pay corresponds to the price that makes E(W) = 0 (i.e., a price such that you do not make any money in the long term, but at least do not lose any either). If the price of the wager is $c, then the expected profit of our hypothetical wager is

E(W) = (−c) × 1/2 + $0.50 × 1/2 = ($0.50 − c)/2

Note that E(W) ≥ 0 if and only if $0.50 − c ≥ 0, or equivalently, if c ≤ $0.50. Hence, to participate in this wager you should be willing to pay any amount equal to or lower than the fair value of 50 cents. A game or bet whose price corresponds to its fair value is called a fair game or a fair bet.
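A minimal sketch of this calculation, assuming the wager structure described above (Python; the function name is my own):

```python
# Expected profit of the hypothetical wager as a function of its price c:
# with probability 1/2 you lose c, with probability 1/2 you win $0.50.

def expected_profit(c):
    return 0.5 * (-c) + 0.5 * 0.50  # = (0.50 - c) / 2

print(expected_profit(0.25))  # positive: favorable to the player
print(expected_profit(0.50))  # zero: the fair value
print(expected_profit(1.00))  # negative: unfavorable (the original bet)
```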

The concept of the fair value of a bet can be used to provide an alternative interpretation of a probability. Consider a bet that pays $1 if event A happens, and $0 otherwise. The expected value of such a bet is $1 × P(A) + $0 × (1 − P(A)) = P(A); that is, we can think of P(A) as the fair value of a bet that pays $1 if A happens, and pays nothing otherwise. This interpretation is valid no matter whether the event can be repeated or not. Indeed, this interpretation of probability underlies prediction markets such as PredictIt (https://www.predictit.org) and the Iowa Electronic Market (http://tippie.biz.uiowa.edu/iem/). Although most prediction markets are illegal in the United States (where they are considered a form of online gambling), they do operate in other English-speaking countries such as England and New Zealand.

2.4 Comparing Wagers

The expectation of a random variable can help us compare two bets. For example, consider the following two wagers:

  • Wager 1: You pay $1 to enter and I roll a die. If it comes up 1, 2, 3, or 4 then I pay you back 50 cents and get to keep 50 cents. If it comes up 5 or 6, then I give you back your dollar and give you 50 cents on top.
  • Wager 2: You pay $1 to enter and I roll a die. If it comes up 1, 2, 3, 4, or 5 then I return to you only 75 cents and keep 25 cents. If it comes up 6 then I give you back your dollar and give you 75 cents on top.

Let Y1 and Y2 represent the profits generated by each of the bets above. It is easy to see that, if the die is fair,

E(Y1) = (−$0.50) × 4/6 + $0.50 × 2/6 = −$1/6 ≈ −$0.17
E(Y2) = (−$0.25) × 5/6 + $0.75 × 1/6 = −$1/12 ≈ −$0.08

These results tell you two things: (1) both bets lose money in the long term because both have negative expected profits; (2) although both are disadvantageous, the second is better than the first because it is the least negative.

You can verify these results by simulating 2000 repetitions of each of the two bets using code that is very similar to the one we used in Section 2.2 (see Figure 2.2, as well as Sidebar 2.1 for details on how to simulate outcomes from nonequiprobable experiments in R).


Figure 2.2 Running profits from Wagers 1 (continuous line) and 2 (dashed line).
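A sketch of this simulation, in Python standing in for the book's R code (random.choices plays the role of R's sample() for nonequiprobable outcomes; variable names are my own):

```python
# Simulate 2000 repetitions each of Wagers 1 and 2 and compare the average
# profits with the expected values -1/6 and -1/12.
import random

random.seed(7)  # for reproducibility
n = 2000

# Wager 1: profit -$0.50 with probability 4/6, +$0.50 with probability 2/6.
w1 = random.choices([-0.50, 0.50], weights=[4, 2], k=n)
# Wager 2: profit -$0.25 with probability 5/6, +$0.75 with probability 1/6.
w2 = random.choices([-0.25, 0.75], weights=[5, 1], k=n)

avg1 = sum(w1) / n  # near E(Y1) = -1/6
avg2 = sum(w2) / n  # near E(Y2) = -1/12
print(avg1, avg2)
```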

Note that, although early on the profit from the first bet is slightly better than the profit from the second, once you have been playing both bets for a while the cumulative profits revert to being close to the respective expected values.

Consider now a second pair of wagers, Wagers 3 and 4, each of which also costs $1 to enter, and let Y3 and Y4 denote the corresponding profits. The expectations associated with these two bets are

E(Y3) = E(Y4) = 0

So, both bets are fair, and the expected value does not help us choose between them. However, these bets are clearly not identical. Intuitively, the first one (Wager 3) is more “risky”, in the sense that the probability of losing our original bet is larger. We can formalize this idea using the notion of the variance of a random variable:

The variance of a random variable X with outcomes x1, x2, …, xn occurring with probabilities p1, p2, …, pn is given by

Var(X) = (x1 − E(X))² p1 + (x2 − E(X))² p2 + ⋯ + (xn − E(X))² pn

As the formula indicates, the variance measures how far, on average, outcomes are from the expectation. Hence, a larger variance reflects a bet with more extreme outcomes, which often translates into a larger risk of losing money. For instance, for Wagers 3 and 4, we have

Var(Y3) > Var(Y4)

which agrees with our initial intuition. Figure 2.3 shows the running profit for 2000 simulations of each of the two wagers. As expected, the more variable Wager 3 oscillates more wildly and takes longer than the less variable Wager 4 to get close to the expected value of 0.


Figure 2.3 Running profits from Wagers 3 (continuous line) and 4 (dashed line).
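The variance formula is easy to apply directly. As an illustration, here is a Python sketch applied to Wagers 1 and 2 from above (function and variable names are my own):

```python
# Expectation and variance of a discrete profit distribution.

def expectation(outcomes, probs):
    return sum(x * p for x, p in zip(outcomes, probs))

def variance(outcomes, probs):
    # Average squared deviation of the outcomes from the expectation.
    mu = expectation(outcomes, probs)
    return sum((x - mu) ** 2 * p for x, p in zip(outcomes, probs))

# Wager 1: profit -$0.50 w.p. 4/6, +$0.50 w.p. 2/6.
var1 = variance([-0.50, 0.50], [4 / 6, 2 / 6])  # 2/9, about 0.222
# Wager 2: profit -$0.25 w.p. 5/6, +$0.75 w.p. 1/6.
var2 = variance([-0.25, 0.75], [5 / 6, 1 / 6])  # 5/36, about 0.139

print(var1, var2)  # Wager 1 is the more variable of the two
```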

Just like the expectation, the variance has some interesting properties. First, the variance is always a nonnegative number (a variance of zero corresponds to a nonrandom number). In addition,

If X is a random variable and a and b are constants (non-random numbers), then

Var(aX + b) = a² Var(X)
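The identity can be checked numerically on any small discrete distribution; a Python sketch (the distribution and the constants a and b below are arbitrary choices of mine):

```python
# Numerical check of Var(aX + b) = a^2 Var(X): shifting by b has no effect
# on the variance, while scaling by a multiplies it by a^2.

def variance(outcomes, probs):
    mu = sum(x * p for x, p in zip(outcomes, probs))
    return sum((x - mu) ** 2 * p for x, p in zip(outcomes, probs))

xs, ps = [-1.0, 0.0, 2.0], [0.5, 0.3, 0.2]
a, b = 3.0, 7.0

lhs = variance([a * x + b for x in xs], ps)  # Var(aX + b)
rhs = a ** 2 * variance(xs, ps)              # a^2 Var(X)
print(lhs, rhs)  # the two agree
```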

A word of caution is appropriate at this point. Note that a larger variance implies not only a higher risk of losing money but also the possibility of making more money in a single round of the game (the maximum profit from wager 3 is actually twice the maximum profit from wager 4). Therefore, if you want to make money as fast as possible (rather than play for as long as you can), you would typically prefer to take an additional risk and go for the bet with the highest variance!

2.5 Utility Functions and Rational Choice Theory

The discussion about the comparison of bets presented in the previous section is an example of the application of rational choice theory. Rational choice theory simply states that individuals make decisions as if attempting to maximize the “happiness” (utility) that they derive from their actions. However, before we decide how to get what we want, we first need to decide what we want. Therefore, the application of the rational choice theory comprises two distinct steps:

  1. We need to define a utility function, which is simply a quantification of a person's preferences with respect to certain objects or actions.
  2. We need to find the combination of objects/actions that maximizes the (expected) utility.

For example, when we previously compared wagers, our utility function was either the monetary profit generated by the wager (in our first example) or a function of the variance of the wager (when, as in the second example, the expected profit from all wagers was the same). However, finding appropriate utility functions for a given situation can be a difficult task. Here are some examples:

  1. All games in a casino are biased against the players, that is, all have a negative expected profit. If the player's utility function were based only on monetary profit, nobody would gamble! Hence, a utility function that justifies people's gambling should include a term that accounts for the nonmonetary rewards associated with gambling.
  2. When your dad used to play cards with you when you were five years old, his goal was probably not to win but to entertain you. Again, a utility function based on money probably makes no sense in this case.
  3. The value of a given amount of money may depend on how much money you already have. If you are broke, $10,000 probably represents a lot of money, and you would be unwilling to take a bet that would make you lose that much, even if the expected profit were positive. On the other hand, if you are Warren Buffett or Bill Gates, taking such a bet would not be a problem.

In this book, we assume that players are only interested in the economic profit and that the fun they derive from playing (the other component of the utility function) is large enough to justify the possibility of losing money. In addition, we will assume that players are risk-averse, so among bets that have the same expected profit, we will prefer those that have lower variances. For this reason, in this book, we will usually look at the expected value of the game first and, if the expected value happens to be the same for two or more choices, we will expect the player to pick the one with the lower variance (which, as we discussed before, minimizes the risk).

2.6 Limitations of Rational Choice Theory

Rational choice theory, although useful for formulating models of human behavior, is not always realistic. A good example of how easily people deviate from the strictly rational behavior defined above is Ellsberg's paradox. Assume that you have an urn that contains 100 blue balls and 200 balls of other colors, some of which are black and some of which are yellow (exactly how many are of each color is unknown). First, you are offered the following two wagers:

  • Wager 1: You receive $10 if you draw a blue ball and nothing otherwise.
  • Wager 2: You receive $10 if you draw a black ball and nothing otherwise.

Which of the two wagers would you prefer? After answering this question, you are offered the following two wagers:

  • Wager 3: You receive $10 if you draw a blue or yellow ball and nothing otherwise.
  • Wager 4: You receive $10 if you draw a black or yellow ball and nothing otherwise.

No matter how many yellow balls there really are, rational choice theory (based on calculating expected values for each wager) predicts that if you prefer Wager 2 to Wager 1, then you should also prefer Wager 4 to Wager 3, and vice versa. To see this, note that the expected payoff from Wager 1 is $10 × 100/300 ≈ $3.33 (because there are exactly 100 blue balls in the urn). Consequently, for Wager 2 to be preferable to Wager 1, you would need to assume that the urn contains more than 100 black balls. But if you assume that there are more than 100 black balls in the urn, the expected value of Wager 3 would be at most $10 × 199/300 ≈ $6.63 (because there would be at most 99 yellow balls along with the 100 blue balls in the urn), while the expected payoff of Wager 4 would always be $10 × 200/300 ≈ $6.67, making Wager 4 always better than Wager 3. The paradox arises from the fact that many people who prefer Wager 1 to Wager 2 actually prefer Wager 4 to Wager 3. This might be because people do not know how to react to the uncertainty about how many black and yellow balls there are and prefer the wagers with less (apparent) uncertainty.
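One way to see the argument all at once is to enumerate every possible composition of the urn: for each possible number of black balls, the expected-value gap between Wagers 1 and 2 equals the gap between Wagers 3 and 4. A Python sketch (names are my own):

```python
# The Ellsberg urn: 100 blue balls, plus b black and (200 - b) yellow balls
# for some unknown b. For every possible b, the expected-value comparison
# between Wagers 1 and 2 matches the one between Wagers 3 and 4.
from fractions import Fraction

for b in range(201):                             # b = number of black balls
    blue, black, yellow, total = 100, b, 200 - b, 300
    e1 = Fraction(10 * blue, total)              # $10 on blue
    e2 = Fraction(10 * black, total)             # $10 on black
    e3 = Fraction(10 * (blue + yellow), total)   # $10 on blue or yellow
    e4 = Fraction(10 * (black + yellow), total)  # $10 on black or yellow
    # Preferring Wager 2 over 1 should therefore imply preferring 4 over 3.
    assert e2 - e1 == e4 - e3

print("consistent for every possible composition of the urn")
```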

Another interesting example is the Allais paradox. Consider three possible prizes – prize A: $0, prize B: $1,000,000, and prize C: $5,000,000. You are first asked to choose between two lotteries:

  • Lottery 1: You get prize B ($1,000,000) for sure.
  • Lottery 2: You get prize A (nothing) with probability 0.01, you get prize B ($1,000,000) with probability 0.89, or you get prize C ($5,000,000) with probability 0.10.

Then you are offered a second set of lotteries:

  • Lottery 3: You get prize A (nothing) with probability 0.89, or you get prize B ($1,000,000) with probability 0.11.
  • Lottery 4: You get prize A (nothing) with probability 0.90, or you get prize C ($5,000,000) with probability 0.10.

Again, many subjects report that they prefer Lottery 1 to Lottery 2 and Lottery 4 to Lottery 3, although rational choice theory predicts that a person who chooses Lottery 1 should also choose Lottery 3.

The Allais paradox is even subtler than Ellsberg's paradox, because each choice (by itself) has an obvious answer (Lotteries 1 and 4, respectively), but taking the two choices together, if you pick Lottery 1 in the first case, you should rationally pick Lottery 3 in the second, because they are essentially the same option. The way we make sense of this (talk about a paradox!) is by noticing that Lottery 1 can be seen as winning $1 million 89% of the time and winning $1 million the remaining 11% of the time. We look at Lottery 1 in this unusual way because it makes it easier to compare with Lottery 3 (where we win nothing 89% of the time and $1 million 11% of the time). We can change the way we look at Lottery 4 for the same reason (to better compare it with Lottery 2): we win nothing 89% of the time, nothing another 1% of the time, and $5 million the remaining 10% of the time. Table 2.1 summarizes this alternative description of the lotteries.

Table 2.1 Winnings for the different lotteries in Allais paradox

Lottery 1                        | Lottery 2                        | Lottery 3                        | Lottery 4
Wins $1 million 89% of the time  | Wins $1 million 89% of the time  | Wins nothing 89% of the time     | Wins nothing 89% of the time
Wins $1 million 11% of the time  | Wins nothing 1% of the time      | Wins $1 million 11% of the time  | Wins nothing 1% of the time
                                 | Wins $5 million 10% of the time  |                                  | Wins $5 million 10% of the time

You can see that Lotteries 1 and 2 are equivalent 89% of the time (they both give you $1 million) and that Lotteries 3 and 4 are the same also 89% of the time (they give you nothing). Let's look at the table if we cross out the row corresponding to what is supposed to happen 89% of the time.

Table 2.2 Winnings for 11% of the time for the different lotteries in Allais paradox

Lottery 1                        | Lottery 2                        | Lottery 3                        | Lottery 4
Wins $1 million 11% of the time  | Wins nothing 1% of the time      | Wins $1 million 11% of the time  | Wins nothing 1% of the time
                                 | Wins $5 million 10% of the time  |                                  | Wins $5 million 10% of the time

In Table 2.2, we see very clearly that Lotteries 1 and 3 represent the same choice and that Lotteries 2 and 4 represent the same choice too. Hence, the conclusion from this paradox is that adding an outcome common to both options (winning $1 million 89% of the time in the first pair, winning nothing 89% of the time in the second), which should be irrelevant to the choice, leads people to deviate from rational choice across the two situations even though there is no reason to do so.
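The raw expected values of the four lotteries (in millions of dollars) can be computed directly; note that the difference in expectation within each pair is identical, mirroring the fact that the two choice problems differ only in a common outcome. A Python sketch:

```python
# Expected values (in millions of dollars) for the four Allais lotteries.

def expectation(outcomes, probs):
    return sum(x * p for x, p in zip(outcomes, probs))

e1 = expectation([1.0], [1.0])                          # Lottery 1: 1.00
e2 = expectation([0.0, 1.0, 5.0], [0.01, 0.89, 0.10])   # Lottery 2: 1.39
e3 = expectation([0.0, 1.0], [0.89, 0.11])              # Lottery 3: 0.11
e4 = expectation([0.0, 5.0], [0.90, 0.10])              # Lottery 4: 0.50

print(e1, e2, e3, e4)
print(e1 - e2, e3 - e4)  # the within-pair differences are identical
```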

The bottom line from these two paradoxes is that, although rational choice is a useful theory that can produce interesting insights, some care needs to be exercised when applying those insights to real life problems, because it seems that people will not necessarily make “rational” choices.

2.7 Exercises

  1. Use the definition of rational choice theory to discuss in which sense gambling can be considered “rational” or “irrational.”

  2. Using the basic principles of the “rational player” described in the text (mainly that a player will first try to maximize the expected value and secondly minimize the variance of the gains), decide which of the wagers below the player would choose. In all the wagers, the player is required to pay $1 to enter.

    1. Wager 1: When you flip a coin and heads comes up, you lose your dollar. If tails comes up, you get your dollar back and an additional $0.25.
    2. Wager 2: When you roll a die and a 1 or a 2 comes up, you lose your dollar; if a 3 or a 4 comes up, you get your dollar back; and if a 5 or a 6 comes up, you get your dollar back and an additional $0.50.
    3. Wager 3: If you roll a die and a 1, 2, or 3 comes up, you lose your dollar. If a 4 comes up, you receive your dollar back, and if a 5 or a 6 comes up, you receive your dollar back and an additional $0.50.
  3. The values of random variables are characterized by their random variability. Explain in your own words what aspect of that variability the expected value is trying to capture. What aspect of the random variability is the variance trying to capture?

  4. If you are comparing the variances of two different random variables and you find out that one is much higher than the other, what does that mean?

  5. Does high variability in the profit of a wager mean a higher or a lower risk of losses?

  6. The expected profit for a new game with price $1 is c02-math-051 cents. If you repeatedly bet $5 for 1000 times, would you expect to win or lose money at the end of the night? How much?

  7. Comment on the following statement: “A rational player will always choose a wager with high variability because it allows for higher gains.”

  8. Consider the three different stocks and their profits below. Which one would a rational player choose?

    1. Stock A: This stock will give you a net profit of $100 with probability 0.8, a net profit of −$150 with probability 10%, a net profit of $200 with probability 5%, or a net profit of −$500 with probability 5%.
    2. Stock B: This stock will give you a net profit of $65 with probability 0.8, a net profit of −$15 with probability 10%, a net profit of $40 with probability 5%, or a net profit of −$50 with probability 5%.
    3. Stock C: This other stock will give you a net profit of $100 with probability 0.5, a net profit of −$150 with probability 20%, a net profit of $200 with probability 15%, or a net profit of −$500 with probability 15%.
  9. Rank your preferences for the following four lotteries (all cost $1 to enter). Explain your choices:

    • L1: Pays $0 with probability 1/2 and $40,000 with probability 1/2.
    • L2: Pays $0 with probability 1/5 and $25,000 with probability 4/5.
    • L3: Pays −$10,000 (so you need to pay $10,000 if you lose!!) with probability 1/2 and $50,000 with probability 1/2.
    • L4: Pays $10 with probability 1/3 and $30,000 with probability 2/3.
  10. Let's say you finally got some money to buy a decent car. You have two alternatives: alternative A corresponds to buying a 10-year-old Corolla and alternative B corresponds to buying a brand new Corolla. Each alternative involves different types of costs (the initial cost of the car and future maintenance costs).

    • For option A, there is an 80% probability that, on top of the $10,000 cost of buying the car, you will face a $2000 cost for major work on the car in the future. There is also a 15% probability that the future costs will be as high as $3000 (for a total cost of $13,000). Finally, the more unlucky face a 5% probability that future costs will be as high as $5000 (for a total cost of $15,000).
    • For option B, there is a pretty high probability (90%) that there are no major costs in maintaining the car in the future and you are subject to just the cost of buying the car ($20,000). There is some chance (say 5%), though, that you might need a new transmission or other major work (say, involving $1000 in costs). There is a smaller probability (3%) of some more serious work being necessary (say, something around $2000). And, for the really unlucky, there is a 2% chance of needing some very serious work (costing something like $3000).

    Which of the two choices would one rationally recommend, and why?

  11. An urn contains 30 yellow balls and 70 balls of other colors (which can be either red or blue). Suppose you are offered the following two bets:

    • Wager 1: You receive $10 if you draw a yellow ball.
    • Wager 2: You receive $10 if you draw a blue ball.
    If you prefer the second bet over the first, which one of the following two wagers would you prefer if you are a rational player?
    • Wager 3: You receive $20 if you draw a red ball.
    • Wager 4: You receive $15 if you draw a yellow or blue ball.
  12. [R] Simulate the profit of both pairs of wagers in the previous exercise and plot the results to see if you made the right decision.

  13. A certain health condition has two possible treatments:

    • Treatment A, if successful, will extend the lifetime of the patient by 36 months. If it fails, it will neither increase nor reduce the expected lifetime of the patient. Clinical trials show that 20% of patients respond to this treatment.
    • Treatment B, if successful, will increase the lifetime of the patient by 14 months, and 65% of patients respond to it. In addition, 10% of the patients subject to this treatment suffer an adverse reaction that reduces their expected lifetime by 2 months; for the rest (25%) the treatment has no effect.
    Which of the two treatments would you recommend, and why? Are there any circumstances under which you would recommend the other treatment? Consider both the point of view of the doctor making the recommendation and that of the patient receiving the treatment.