I want to share a very cool puzzle that combines several ideas from the past few chapters and adds a twist.
I’m not sure where this puzzle originated. I heard it first from Professor John Conley of Vanderbilt University, and I’ve often assigned it to my classes. Google has used it to weed out job candidates. The answer that Google expects is the same answer I gave John Conley and the answer I usually get from my best students. But it turns out there’s more to be said.
Oh, by the way—although this is the first problem of the chapter, I’m calling it problem 10, not problem 1. Don’t worry for now about why.
Ten ferocious pirates have come into possession of 100 gold coins, which they must divide among themselves. The procedure for dividing them is as follows:
The most ferocious pirate proposes a division (such as “I’ll take fifty, the second-most-ferocious pirate gets ten, and everyone else gets five apiece”).
The pirates vote on whether to accept the division. As long as half or more of the pirates vote yes, the division is accepted, the coins are divided, and the game is over.
If, instead, fewer than half the pirates vote yes, the most ferocious pirate is thrown overboard. The second-most-ferocious pirate now becomes the most ferocious pirate and we return to step 1.
Each pirate’s first priority is to avoid being thrown overboard; his second priority is to amass as many coins as possible; and his third priority is to have the pleasure of seeing other pirates thrown overboard.
The pirates are assumed to be perfectly rational and fully aware of one another’s strategies.
What happens?
SOLUTION:
This is a perfect opportunity to apply the backward induction technique that we first met in chapter 9. We’ll start with some simpler problems:
Problem 1:
Suppose there’s just one ferocious pirate. Call him James. Obviously, James allocates 100 coins to himself, votes for his own plan, and wins.
Problem 2:
Suppose there are two ferocious pirates—Igor and James, with Igor the most ferocious. Now Igor has half the votes and can therefore impose any plan he wants to. He therefore allocates 100 coins to himself, votes for his own plan, and wins. James gets nothing.
Problem 3:
Now let’s add a third pirate—Howard, who is even more ferocious than Igor. If Howard wants to avoid being thrown overboard, he needs at least two votes for his plan. He’s got his own vote, so he’s got to buy a vote from either Igor or James—both of whom realize that if Howard goes overboard, they’ll be playing the game from problem 2. Igor, who stands to collect all 100 coins in that game, would cost 101 coins to buy, which is more than Howard has. James, on the other hand, foresees that if we end up in problem 2, he’ll get 0—and can therefore be bought for a single coin. So Howard’s proposal is 99 for Howard, 0 for Igor, 1 for James. Howard and James vote yes and the plan is approved.
Problem 4:
Now let’s add a fourth pirate, the even more ferocious George. To pass a plan, George needs two votes—his own and someone else’s. If his plan fails, we’ll be in the world of problem 3, where Howard gets 99 coins, Igor gets 0, and James gets 1. Therefore it would take 100 coins to buy Howard’s vote, it would take 1 coin to buy Igor’s, and it would take 2 coins to buy James’s. Igor’s vote is the cheapest, so that’s the one George buys. George’s proposal, then, is 99 for George, 0 for Howard, 1 for Igor, 0 for James. George and Igor vote yes, and the plan is approved.
Problem 5:
If there’s a fifth and even more ferocious pirate—call him Freddy—then Freddy needs three out of five votes to pass his plan. In addition to his own vote, he’ll buy the two cheapest votes available. In view of what happens in problem 4, George’s price is 100, Howard’s is 1, Igor’s is 2, and James’s is 1. Therefore Freddy buys Howard’s and James’s votes for 1 coin each. Freddy gets 98, George gets 0, Howard gets 1, Igor gets 0, James gets 1.
And so on (you can fill in problems 6, 7, 8, and 9 yourself!) until we reach
Problem 10 (the original problem!):
With 10 pirates, Arlo (the most ferocious) gets 96, Bob gets 0, Charlie gets 1, Dave gets 0, Eeyore gets 1, Freddy gets 0, George gets 1, Howard gets 0, Igor gets 1, and James gets 0. Charlie, Eeyore, George, and Igor all join Arlo in voting yes, because if they vote no, Bob will propose a division in which they get nothing.
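If you’d rather not work out problems 6 through 9 by hand, here’s a little sketch in Python (my own illustration, not part of the original puzzle) that runs the same backward induction for any number of pirates. It assumes, as the whole argument so far does, that every pirate votes for his preferred outcome; when several votes are equally cheap, it buys from the more ferocious pirates first, which is one of several equally good choices.

def pirate_split(n, coins=100):
    """Backward-induction allocation for n pirates, most ferocious first."""
    allocation = [coins]                     # problem 1: a lone pirate keeps it all
    for k in range(2, n + 1):
        # 'allocation' is the outcome of the (k - 1)-pirate game, which is
        # what each pirate compares against when the new proposer makes his offer.
        votes_needed = (k + 1) // 2          # half or more of k votes
        bribes = votes_needed - 1            # the proposer votes for himself
        # A pirate who would get x coins next round can be bought for x + 1 now,
        # so the proposer buys the cheapest votes available.
        cheapest = sorted(range(k - 1), key=lambda i: allocation[i])[:bribes]
        new_allocation = [0] * k
        for i in cheapest:
            new_allocation[i + 1] = allocation[i] + 1
        new_allocation[0] = coins - sum(new_allocation)
        allocation = new_allocation
    return allocation

print(pirate_split(10))   # prints [96, 0, 1, 0, 1, 0, 1, 0, 1, 0]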
Conclusion: The only possible outcome is 96-0-1-0-1-0-1-0-1-0.
I want to repeat that:
Main Conclusion: 96-0-1-0-1-0-1-0-1-0 is the only possible outcome.
That’s the solution I’ve always wanted from my students and apparently the solution that Google has always wanted from its job candidates. Unfortunately, there’s a fly in the ointment: the main conclusion is false. There are plenty of other solutions. Here’s one:
ANOTHER SOLUTION:
The most ferocious pirate proposes to take all the coins, leaving everyone else with nothing. The other nine all want this proposal to fail. Nevertheless, they all vote yes and the proposal is accepted.
Wait a minute! What happened to the rationality assumption? Why, for example, would Bob—the second-most-ferocious pirate—ever vote in favor of a division that leaves him with 0 coins?
The answer is that Bob, being perfectly rational, is aware that his vote doesn’t matter. As long as everyone else votes yes, the proposal is sure to pass. So his yes vote, which doesn’t change the outcome, is perfectly rational. (A no vote would be equally rational.)*
Another possible solution is that one of the pirates (perhaps even the most ferocious pirate!) votes no, while the rest vote yes. Or three vote no while seven vote yes.
In each of these cases no single voter can change the outcome, so there’s no particular reason for anyone to change his vote. In other words, everyone is being perfectly rational, taking as given the choices made by everyone else.*
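To spell the point out mechanically, here’s a small check (again my own illustration, not part of the original puzzle): in each of the voting patterns just described, no single pirate can change whether the proposal passes by switching his vote.

def passes(yes_votes, pirates=10):
    """A proposal passes when half or more of the pirates vote yes."""
    return 2 * yes_votes >= pirates

# Everyone votes yes on the 100-coins-for-Arlo proposal.
assert passes(yes_votes=10)

# Any one pirate who switches to no still leaves 9 yes votes, so the
# proposal passes anyway and his payoff is unchanged either way.
assert passes(yes_votes=9)

# The same holds in the seven-yes, three-no case: one switch in either
# direction leaves 6 or 8 yes votes, still half or more.
assert passes(yes_votes=6) and passes(yes_votes=8)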
To highlight the point, here’s one more problem:
The 2040 US presidential election has come down to Justin Bieber versus Miley Cyrus.* It happens that every single voter prefers Cyrus, and all voters are perfectly rational. Who wins?
SOLUTION:
It’s quite impossible to say. Bieber could quite plausibly get 100 percent of the votes. After all, if you’re aware that everyone else is voting for Bieber anyway, then you know Bieber is sure to win. In that case, your vote can’t change the outcome, so you might as well vote for Bieber along with everyone else. (You also, of course, might as well vote for Cyrus.) There is nothing irrational about voting for the candidate you hate, provided you’re quite sure your vote won’t change the outcome.
Of course this solution depends very heavily on the additional assumption that voters are fully aware of one another’s intentions, and hence fully aware that no voter can change the outcome. But it is a solution to the problem as stated, which certainly does not rule out this additional assumption.
Back to the pirates now. We have a problem here.
Our main conclusion tells us that the only possible outcome is 96-0-1-0-1-0-1-0-1-0, with all of the odd-numbered pirates voting yes. But that’s not true! We’ve in fact discovered a great many additional outcomes.
So we’re forced to ask the following question:
Where is the mistake in the solution to The Ferocious Pirates?
SOLUTION:
That solution (as you’ll see if you flip back a couple of pages and reread it carefully) makes this implicit (and quite unwarranted) assumption:
Pirates always vote for their preferred outcomes.
Given that assumption, the argument really is airtight and the main conclusion is correct.
But we have nothing to justify that assumption. From the statement of the problem, we can’t predict anything about how pirates behave in cases where their votes don’t matter. And that’s a lot of cases, leading to a lot of additional solutions.
If you want to rule out the weird extra solutions, you’ve got to change the statement of the problem in order to give the pirates a reason to vote for the outcomes they want. That’s easy. Instead of assuming that pirates always know one another’s strategies, let’s allow them a little doubt. If Bob the pirate is not sure how the others are voting, it’s always possible he’s got the deciding vote—which makes voting for his preferred outcome the only rational choice.
Why might Bob be unsure of how the others are voting? One possibility: maybe he’s not 100 percent sure they’re all rational. Or maybe he’s not 100 percent sure that they all know they’re all rational. Or maybe he’s not 100 percent sure they all know that they all know that they’re all rational. If rationality is not common knowledge,* then pirates can never be entirely sure of how other pirates will behave, and can cover all their bases only by voting for their preferred outcomes. In that case the main conclusion is surely right and the alternative solutions can be discarded.
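To see why even a sliver of doubt is enough, here is one last illustration of my own (the numbers are made up, not part of the original problem). Suppose Bob believes there is some probability p, however small, that his vote is the deciding one; when it isn’t, his payoff is the same no matter how he votes. Then voting for his preferred outcome beats voting against it by p times the difference between the two outcomes, which is strictly positive whenever p is.

def expected_coins(vote_preferred, p, preferred=1, other=0, fixed=0):
    """Bob's expected coins when his vote is pivotal with probability p."""
    pivotal_payoff = preferred if vote_preferred else other
    return p * pivotal_payoff + (1 - p) * fixed

for p in (0.5, 0.01, 0.0001):
    gain = expected_coins(True, p) - expected_coins(False, p)
    print(f"chance of being pivotal {p}: voting his preference gains {gain} coins")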
So why not avoid all the muss by simply stating the problem a little more realistically in the first place? One answer is that it’s quite standard in lower-level economics courses always to assume that everyone knows everyone else’s strategies for certain. (This assumption gets dropped soon enough in higher-level courses.) Sometimes our students learn to think of that assumption as a harmless simplification. Occasionally even a professor falls into that trap. This problem, and others like it, is a good reminder that this particular simplification can be far from harmless.
A historical note: I assigned this problem to classes for years, expecting (and usually getting) the 96-0-1-0-1-0-1-0-1-0 answer. One day in class, while I was reviewing that “correct” answer, my student Matt Wampler-Doty raised his hand to point out the plethora of alternative solutions and the unjustified assumption in the “official” analysis. I was sure he was wrong and made some inane attempt to dismiss him, but to his credit, he persisted until I got the point. He made my day.