Probability puzzle: Coins in envelopes


Problem 2, by Bayes' rule.

Let N be a random variable (RV) for the number of filled envelopes.

Let C be the event that the selected envelope contains a coin.

The prior distribution:

```
P(N=n) = 1/(m+1)
```

By the problem setup:

```
P(C|N=n) = n/m
```

By the law of total probability:

```
P(C) = sum_n P(C|N=n) P(N=n) = sum_n n/(m(m+1)) = (m(m+1)/2) / (m(m+1)) = 1/2
```

By Bayes' rule:

```
P(N=n|C) = P(C|N=n) P(N=n) / P(C) = 2n/(m(m+1))
```

Let C' be the event of picking a filled envelope the second time.

By the problem statement (the emptied envelope is shuffled back in):

```
P(C'|N=n,C) = (n-1)/m
```

By the law of total probability, using sum_n n(n-1) = (m-1)m(m+1)/3:

```
P(C'|C) = sum_n P(C'|N=n,C) P(N=n|C) = sum_n 2n(n-1)/(m^2(m+1)) = 2(m-1)/(3m)
```

Solving P(C'|C) = P(C), that is 2(m-1)/(3m) = 1/2, gives

m = 4
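A quick exact check of the derivation above, as a Python sketch (`posterior_stats` is a name introduced here; `fractions` keeps the arithmetic exact):

```python
from fractions import Fraction

def posterior_stats(m):
    # Uniform prior over N = 0..m filled envelopes.
    prior = [Fraction(1, m + 1) for _ in range(m + 1)]
    # P(C) = sum_n (n/m) * P(N=n)
    p_c = sum(Fraction(n, m) * prior[n] for n in range(m + 1))
    # Posterior P(N=n | C) by Bayes' rule.
    post = [Fraction(n, m) * prior[n] / p_c for n in range(m + 1)]
    # After the emptied envelope is shuffled back: P(C' | N=n, C) = (n-1)/m.
    p_c2 = sum(Fraction(n - 1, m) * post[n] for n in range(1, m + 1))
    return p_c, p_c2

print(posterior_stats(4))  # (Fraction(1, 2), Fraction(1, 2)): the two draws match at m = 4
```

For any other m the two probabilities differ, in line with the closed form 2(m-1)/(3m).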

I must be missing something: both 1 and 2 appear to be impossible. He picked a random envelope, it had a coin, he removed it, she put that envelope, now empty, back with the others. How can the expected value of picking an envelope not have decreased? There are the same number of envelopes, and one fewer coin.

He also finds out that the envelope wasn't empty. If the envelopes are positively correlated -- knowing that one envelope has money increases the probability that another envelope has money -- then this can counteract the effect of replacing this envelope. The trick is to make the correlations be such that the two effects cancel out exactly.
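One way to see the cancellation concretely is by simulation. This is a minimal Monte Carlo sketch assuming the uniform prior over 0..m filled envelopes and m = 4 (the value derived elsewhere in the thread); `play` is an illustrative name:

```python
import random

def play(m, trials=200_000):
    # Estimate P(second draw has a coin | first draw had a coin),
    # with the emptied envelope shuffled back in.
    hits = total = 0
    for _ in range(trials):
        n = random.randint(0, m)          # uniform prior on number of coins
        envelopes = [True] * n + [False] * (m - n)
        random.shuffle(envelopes)
        if not envelopes[0]:
            continue                      # condition on the first draw having a coin
        envelopes[0] = False              # coin removed, empty envelope returned
        total += 1
        hits += envelopes[random.randrange(m)]
    return hits / total

print(play(4))  # close to 0.5: the update and the removal cancel when m = 4
```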

Yeah, that's the solution. And one that should be obvious to anyone familiar with Jaynes. Probabilities are about states of knowledge, not about physical propensity.

Unfortunately, although I'm familiar with Jaynes, I jumped to a propensity interpretation. Fewer coins *must* mean a lower chance of picking a coin - which it obviously would, for someone whose estimate of the total number of coins doesn't change. And then I spent my energy marshaling the arguments why these wackos must be wrong and don't know diddley.

My takeaway is that it is usually more useful to ask how something could be true than why it must be false. I think the solution would have been obvious if I had spent my energy looking for it, instead of denying its existence.

Also, I should take pains to keep in mind that the other guy isn't a moron, and that when I limit myself to 5 seconds of honest reflection on a problem, I am.

Yeah, that's the solution. And one that should be obvious to anyone familiar with Jaynes. Probabilities are about states of knowledge, not about physical propensity.

That's why I expected to "get" fewer people on LW than on xkcd. One welcome surprise was that it seems to have served as an intuition pump for one person over there, who had, only a few days earlier, written

The point is that you cannot, from the observations described, exactly determine the probability...

This same person initially responded that the problem was impossible, but then was enlightened:

So Bob's probabilities are a function of Bob's knowledge...Mea culpa.

I'm having trouble understanding how you (and buybuydandavis) see this puzzle as illustrating (or evidencing?) a subjective approach to probability. Wouldn't it be perfectly solvable in the frequency/propensity approaches in just the same way? Conditional probability and the Bayes rule work the same way everywhere.

(I haven't read Jaynes yet.) (Also, I enjoyed working out your puzzle, and reposted it on my blog; hope you don't mind.)

Interesting, there was recently a somewhat related question posed here.

Using my experience from that question I can give a pretty large group of answers for problem 1. Pubbfr nal z, naq nffvta n ahzore bs pbvaf tvira ol n ovabzvny qvfgevohgvba jvgu a=z naq fbzr c xabja gb Obo, fb gung vg'f nf vs lbh syvccrq na vaqrcraqrag pbva sbe rnpu rairybcr.

2 is a bit tricky, since perfect mathematicians don't just eliminate the obviously wrong, they also update against the unlikely but right. Bayes' rule says that if you get a coin, you multiply your probabilities by P(got coin | X envelopes filled)/P(got coin). P(got coin) is 1/2, your chance of getting a coin if there are X coins is X/m, and your prior that there are X coins is 1/(m+1) (since 0 is a valid number of coins too). So after getting a coin, the hypothesis that there are X envelopes with coins in them gets probability 2X / (m(m+1)).

*Gut check stop. This means that for m=2, Bob would say, P(0) = 0, P(1) = 1/3, and P(2) = 2/3 after getting one coin. Looks right.*
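The gut check can be run mechanically. This sketch (assuming the uniform prior from the thread; `posterior_after_coin` is a name introduced here) computes the posterior over X for any m:

```python
from fractions import Fraction

def posterior_after_coin(m):
    # P(X filled | got coin) = (X/m) * (1/(m+1)) / (1/2) = 2X / (m(m+1))
    return [Fraction(2 * x, m * (m + 1)) for x in range(m + 1)]

print(posterior_after_coin(2))  # values 0, 1/3, 2/3, matching the gut check
```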

Each hypothesis leads to an expected value of (X-1)/(m-1). So we take the sum of 2X(X-1) / ((m-1)m(m+1)) (thanks Wolfram) to get 2/3. No matter the m, the expected value for the second draw is 2/3! It's Laplace's rule of succession! Cool, huh? I'm going and giving damang an upvote just for how helpful his post was for this one. Shame about it making the problem unanswerable :P

EDIT: part two was answering the wrong question, see comment.

[This comment is no longer endorsed by its author]

This went over well in the xkcd logic puzzle forum (my hand was not removed), so I thought I'd try it here. It came to me in a dream, so by solving it you may be helping to summon an elder god or something.

Bob replies, "That depends on what random function you used to choose how many envelopes to fill. If you, say, flipped m coins and put each one that came up heads in an envelope, the expected value is $.50."

Alice explains what her random function was, and Bob calculates the expected value. For kicks, he pays her that amount, and she lets him pick a random envelope. It has a coin in it! Bob pockets the coin. Alice then takes the now-empty envelope back, and shuffles it into the others. "Congratulations," she says. "So, what's the expected value of playing the game again, now that there's one fewer coin?"

"Same as before," Bob replies.

Problem 1: Give a value for m and a random function for which this makes sense (there are many).