Post History

#3: Post edited by celtschk · 2021-07-11T15:05:29Z (over 3 years ago)
Another interference
#2: Post edited by celtschk · 2021-07-11T15:04:25Z (over 3 years ago)
Fixed markdown interfering with MathJax
#1: Initial revision by celtschk · 2021-07-11T15:01:13Z (over 3 years ago)
Let's assume you know that urn B has 5 balls in it. I deliberately take an odd number, because that way we know for sure that there are *not* exactly the same number of red and white balls in that urn.

Note that since you don't know the content of the urn, you have to assign probabilities to its possible contents.

Now what could the content of the urn be? Well, for example, it could have 1 red and 4 white balls. But then, it could also have 1 white and 4 red balls in it. So which of those is more likely? Well, unless you have some reason to assume that the urn contains more white balls than red, or more red balls than white, you have to assign the same probability to both.

Now for *any* number of balls in the urn, exchanging red and white balls gives another possible content of the urn, and the same argument as above gives equal probabilities for both of those contents.

So what is the probability of drawing a red ball from urn B, given that it has 5 balls? Well, let's denote the probability of drawing red by $p(R)$, the probability that the urn contains $n$ red balls (and $5-n$ white balls) by $p(n)$, and the probability of drawing red given that there are $n$ red balls in the urn by $p(R\mid n) = \frac{n}{5}$. Then the law of total probability gives us:
$$p(R) = \sum_{n=0}^{5} p(R\mid n)\,p(n) = \frac05\,p(0) + \frac15\,p(1) + \frac25\,p(2) + \frac35\,p(3) + \frac45\,p(4) + \frac55\,p(5)$$
But by the argument above, $p(n) = p(5-n)$, therefore the above simplifies to
$$p(R) = \left(\tfrac05+\tfrac55\right)p(0) + \left(\tfrac15+\tfrac45\right)p(1) + \left(\tfrac25+\tfrac35\right)p(2) = p(0) + p(1) + p(2) = \frac12$$
where the last equality again uses the symmetry and the fact that all probabilities have to add up to $1$: by symmetry, $p(0)+p(1)+p(2) = p(5)+p(4)+p(3)$, and since all six probabilities together sum to $1$, each half equals $\frac12$.
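As a quick sanity check, take one particular symmetric prior, the uniform one with $p(n) = \frac16$ for every $n$ (chosen here purely for illustration; the argument does not depend on this choice):
$$p(R) = \frac{0+1+2+3+4+5}{5\cdot 6} = \frac{15}{30} = \frac12$$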

Now this analysis works not just for $5$ balls, but for any odd number of balls, and with a minor change also for all even numbers of balls. Thus no matter how many balls there are in urn B, the probability of drawing a red ball will always turn out to be $1/2$. For this reason, it also doesn't matter that you don't actually know the number of balls in urn B (except that of course there has to be at least one ball in it).
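To spell out the general case, including the "minor change" for even numbers, let $N$ be the total number of balls and keep the notation from above, so $p(n)$ is the prior probability of $n$ red balls and $p(R\mid n) = \frac{n}{N}$; this is just a compact restatement of the same symmetry argument. The second sum below is the first one reindexed via $n \mapsto N-n$, so both equal $p(R)$, and the symmetry $p(n) = p(N-n)$ merges them:
$$2\,p(R) = \sum_{n=0}^{N} \frac{n}{N}\,p(n) + \sum_{n=0}^{N} \frac{N-n}{N}\,p(N-n) = \sum_{n=0}^{N} \left(\frac{n}{N} + \frac{N-n}{N}\right) p(n) = \sum_{n=0}^{N} p(n) = 1$$
so $p(R) = \frac12$ for any $N \ge 1$, odd or even.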

Now whether it is really *irrational* to choose urn A over urn B is a completely different question. I think the text is wrong in claiming this. It is true that the expectation value is the same. But the expectation value is not everything.

Consider the specific case that urn A contains one red and one white ball, while urn B with equal probability contains two white balls, a white and a red ball, or two red balls. Note that here we are in a *better* situation than in the original puzzle, because we are actually given both the possible contents of the urn and the corresponding probabilities.

Now let's consider that we play two rounds. Obviously the expected number of wins is one, no matter which of the urns we choose. Thus according to that text, both choices should be equivalent.
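In numbers, writing $E_A$ and $E_B$ for the expected number of wins over the two rounds (labels introduced here for brevity), the per-round winning probabilities from above give:
$$E_A = 2\cdot\tfrac12 = 1, \qquad E_B = 2\cdot\left(\tfrac13\cdot 0 + \tfrac13\cdot\tfrac12 + \tfrac13\cdot 1\right) = 2\cdot\tfrac12 = 1$$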

But let us ask a different question: What is the probability that we don't win anything? Well, with urn A, the probability clearly is $1/4$: there are four equally likely outcomes of the two draws, and only one of them (white both times) wins nothing. But with urn B, with probability $1/3$ we have an urn from which we are *guaranteed* to draw a white ball twice, with probability $1/3$ we have an urn just like A, where the probability of winning nothing is again $1/4$, and with probability $1/3$ we have an urn with two red balls, where we always win. Therefore the probability of not winning either round is $5/12$, which is considerably higher than $1/4$.
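Written out as a single calculation, conditioning on the three equally likely contents of urn B and writing $q_B$ for its no-win probability (a label introduced here):
$$q_B = \tfrac13\cdot 1 + \tfrac13\cdot\left(\tfrac12\right)^2 + \tfrac13\cdot 0 = \tfrac13 + \tfrac1{12} = \tfrac{5}{12} > \tfrac14$$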

In other words, with urn B indeed the *risk* is higher, although the probability of winning a single game (and therefore the expectation value) is equal. And thus if you are risk averse, choosing A over B is indeed rational.

Anyone arguing otherwise would also have to argue that betting on getting a billion dollars with a probability of $1/1000000$ is equivalent to getting a thousand dollars for sure.
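If you want to check these numbers numerically, here is a minimal Monte Carlo sketch in plain Python (no external dependencies). The function names and the modelling choice that each round is an independent draw from the urn's fixed content are my own assumptions based on the description above, not part of the original puzzle text.

```python
import random

ROUNDS = 2          # rounds played per game
TRIALS = 1_000_000  # simulated games per urn

def play_urn_a() -> int:
    """Urn A: one red and one white ball; return the number of red draws."""
    return sum(random.random() < 0.5 for _ in range(ROUNDS))

def play_urn_b() -> int:
    """Urn B: 0, 1 or 2 red balls out of 2, each content with probability 1/3."""
    reds = random.choice([0, 1, 2])               # fix the urn's content once per game
    return sum(random.random() < reds / 2 for _ in range(ROUNDS))

for name, play in [("A", play_urn_a), ("B", play_urn_b)]:
    wins = [play() for _ in range(TRIALS)]
    expected = sum(wins) / TRIALS                 # should be close to 1 for both urns
    no_win = sum(w == 0 for w in wins) / TRIALS   # roughly 1/4 for urn A, 5/12 for urn B
    print(f"Urn {name}: expected wins ~ {expected:.3f}, P(no win) ~ {no_win:.3f}")
```

The expected number of wins comes out the same for both urns, while the no-win probability is noticeably higher for urn B, which is exactly the risk difference described above.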