# Why rational to be indifferent between two urns, when urn A has 50-50 red and white balls, but you don't know urn B's ratio?

Please see the bolded sentence below. Assume that I'm risk averse and "prefer the known chance over the unknown". Why is it irrational for me to choose A?

> Also, there were problems on the probability side. One famous debate concerned a paradox posed by Daniel Ellsberg (of later fame due to publishing the Pentagon Papers). It involved multiple urns, some with known and some with unknown odds of drawing a winning ball. Instead of estimating the expected value of the unknown probability, and sticking with that estimate, most people exhibit strong aversion to ambiguity in violation of basic probability principles. A simpler version of the paradox would be as follows. You can choose one of two urns, each containing red and white balls. If you draw red you win $100, and nothing otherwise. You know that urn A has exactly a 50-50 ratio of red and white balls. In urn B, the ratio is unknown. From which urn do you wish to draw? **Most people say A since they prefer the known chance over the unknown, especially since some suspect that urn B is perhaps stacked against them.** But even if people can choose the color on which to bet, they still prefer A.
>
> Rationally, you should be indifferent, or if you think you can guess the color ratios, choose the urn with the better perceived odds of winning. Yet, smart people would knowingly violate this logical advice.

Paul Slovic, *The Irrational Economist* (2010), p. 56.

## 2 answers

Let's assume you know that urn B has 5 balls in it. I deliberately take an odd number, because that way we know for sure that there are *not* exactly the same number of red and white balls in that urn.

Note that since you don't know the content of the urn, you have to assign probabilities to its possible contents.

Now what could the content of the urn be? Well, for example, it could have 1 red and 4 white balls. But then, it could also have 1 white and 4 red balls. So which of those is more likely? Well, unless you have some reason to assume that the urn contains more white than red balls, or more red than white ones, you have to assign the same probability to both.

Now for *any* number of balls in the urn, exchanging red and white balls gives another possible content of the urn, and the same argument as above gives equal probabilities for both of those contents.

So what is the probability of drawing a red ball from urn B, given that it has 5 balls? Well, let's denote the probability of drawing red by $p(R)$, and the probability that the urn contains $n$ red balls (and $5-n$ white balls) by $p(n)$. Drawing red from an urn with $n$ red balls out of 5 has probability $n/5$, so the law of total probability gives us: $$p(R) = \frac05 p(0) + \frac15 p(1) + \frac25 p(2) + \frac35 p(3) + \frac45 p(4) + \frac55 p(5)$$ But by the argument above, $p(n) = p(5-n)$, therefore the above simplifies to $$p(R) = \left(\frac05+\frac55\right)p(0) + \left(\frac15+\frac45\right)p(1) + \left(\frac25+\frac35\right)p(2) = p(0) + p(1) + p(2) = \frac12$$ where the last equality is again because of the symmetry: $p(0)+p(1)+p(2)$ equals $p(3)+p(4)+p(5)$, and together all six probabilities have to add to $1$.

Now this analysis works not just for $5$ balls, but for any odd number of balls, and with a minor change also for all even numbers of balls. Thus no matter how many balls there are in urn B, the probability of drawing a red ball will always turn out to be $1/2$. For this reason, it also doesn't matter that you don't actually know the number of balls in urn B (except that of course there has to be at least one ball in it).
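As a sanity check, the symmetry argument can be verified with exact arithmetic. The sketch below (helper names are my own) builds an arbitrary prior over the number of red balls that is symmetric under $n \leftrightarrow N-n$, and confirms that $p(R) = 1/2$ for both an odd and an even urn size:

```python
from fractions import Fraction

def p_red(prior):
    # law of total probability: p(R) = sum over n of p(n) * (n / N),
    # where prior[n] is the probability that the urn holds n red balls
    N = len(prior) - 1
    return sum(p * Fraction(n, N) for n, p in enumerate(prior))

def symmetric_prior(weights, N):
    # mirror arbitrary weights onto n and N - n, then normalize,
    # so that p(n) = p(N - n) holds by construction
    w = [Fraction(0)] * (N + 1)
    for n, wt in enumerate(weights):
        w[n] += Fraction(wt)
        w[N - n] += Fraction(wt)
    total = sum(w)
    return [x / total for x in w]

# 5 balls (odd) and 6 balls (even), with arbitrarily chosen symmetric priors
print(p_red(symmetric_prior([3, 1, 7], 5)))     # 1/2
print(p_red(symmetric_prior([1, 2, 3, 4], 6)))  # 1/2
```

Any weights whatsoever give $1/2$, as long as the resulting prior is symmetric — which is exactly the claim in the text.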

Now whether it is really *irrational* to choose urn A over urn B is a completely different question. I think the text is wrong in claiming this. It is true that the expectation value is the same. But the expectation value is not everything.

Consider the specific case that urn A contains one red and one white ball, while urn B can, with equal probability, contain two white balls, a white and a red ball, or two red balls. Note that here we are in a *better* situation than in the original puzzle, because we are actually given both the possible contents of the urn and the corresponding probabilities.

Now let's consider that we play two rounds, drawing with replacement. The expected number of wins is one round, no matter which of the urns we choose. Thus according to that text, both choices should be equivalent.

But let us ask a different question: What is the probability that we don't win anything? Well, with urn A, the probability clearly is $1/4$: there are four equally likely outcomes, and in only one of them is the white ball drawn twice. But for urn B, with probability $1/3$ we have an urn where you are *guaranteed* to get a white ball twice, and with another probability $1/3$ you get the same urn as A, where the probability of drawing white twice is again $1/4$. Therefore the probability of not winning either game is $\frac13 + \frac13\cdot\frac14 = \frac5{12}$, which is considerably higher than $1/4$.
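The arithmetic can be checked exactly — a small sketch using exact fractions, with two rounds drawn with replacement from the same urn, as in the four-outcome count above:

```python
from fractions import Fraction

third, half = Fraction(1, 3), Fraction(1, 2)

# urn A: win each round with probability 1/2
expected_wins_a = 2 * half                            # = 1
p_no_win_a = (1 - half) ** 2                          # = 1/4

# urn B: the urn is all-white, mixed, or all-red with probability 1/3 each,
# and both rounds are drawn (with replacement) from that SAME urn
expected_wins_b = third * 0 + third * 1 + third * 2   # = 1
p_no_win_b = third * 1 + third * (1 - half) ** 2 + third * 0

print(expected_wins_a, expected_wins_b)  # 1 1
print(p_no_win_a, p_no_win_b)            # 1/4 5/12
```

The expectations agree, but the probability of walking away empty-handed does not — which is the whole point.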

In other words, with urn B indeed the *risk* is higher, although the probability of winning a single game (and therefore the expectation value) is equal. And thus if you are risk averse, choosing A over B is indeed rational.

Anyone arguing otherwise would also have to argue that a bet paying a billion dollars with probability $1/1000000$ is equivalent to getting a thousand dollars for sure.

---

Say you have a coin that, when flipped, will land either heads or tails. What is the probability that it lands, say, heads? The "real" answer is that the probability is unknown: that information was not given at the start, so strictly we cannot proceed further. But if we insist on moving on, we have to have a number. So we *assume* the probability is *exactly* 1/2, because there is one desired outcome (heads) out of two possible outcomes (heads, tails), and with no further information we have no reason to think that heads is more likely or that tails is more likely.

Say an urn C has exactly one ball, either a red ball or a white ball. That's the only information you have. What then is the probability that a, say, red ball is drawn? The *real* answer is that the probability is unknown. But if we insist on moving on, we *assume* the probability is *exactly* 1/2.

Urn B has only red or white balls. We don't know how many of each there are. What is the probability that, say, a red ball is drawn? We *assume* the probability is *exactly* 1/2. This is the same as the probability for urn A. And since the probabilities are the same for urn A and for urn B, there is no reason to prefer one over the other.
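This indifference assumption can be illustrated with a quick Monte Carlo sketch — the urn size and the uniform prior over compositions are my own modeling choices, not part of the puzzle:

```python
import random

def draw_from_urn_b(n_balls, rng):
    # unknown composition: model it as every count of red balls
    # from 0 to n_balls being equally likely
    n_red = rng.randrange(n_balls + 1)
    return rng.random() < n_red / n_balls

rng = random.Random(42)
trials = 100_000
wins = sum(draw_from_urn_b(10, rng) for _ in range(trials))
print(wins / trials)  # close to 0.5, matching urn A
```

The empirical frequency of drawing red hovers around 1/2, the same as urn A's known probability.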
