There is new information in the first scenario, but how does it allow you to update the probability that the coins are different without thinking of today as randomly selected?

Imagine you are woken up on both days, but the color of the room may differ: on each day, heads means the room is blue and tails means it is red. You are asked the probability that the coins are different.

HH: blue blue
HT: blue red
TH: red blue
TT: red red

Now you wake up and see "blue." That is new information. You now know that there is at least one "blue", and you can eliminate TT. 

However, I think everyone would agree that the probability is still 1/2. It was 1/2 to begin with, and seeing "blue" rather than "red", while new information, is not relevant to whether the coins are different.
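For what it's worth, even if one does model "today" as a uniformly random one of the two days (the very mechanism under discussion below), a quick Monte Carlo sketch recovers the 1/2; the function name and modelling choices here are mine:

```python
import random

def p_diff_given_blue(trials=100_000, seed=1):
    # Two fair coins, one per day; each day the room shows that day's
    # coin (heads -> blue, tails -> red).  You wake on a uniformly
    # random one of the two days and see a blue room.
    rng = random.Random(seed)
    saw_blue = 0
    saw_blue_and_diff = 0
    for _ in range(trials):
        day1 = rng.choice("HT")
        day2 = rng.choice("HT")
        today = rng.choice((day1, day2))  # pick one day uniformly
        if today == "H":                  # blue room today
            saw_blue += 1
            saw_blue_and_diff += (day1 != day2)
    return saw_blue_and_diff / saw_blue

print(p_diff_given_blue())  # stays near 1/2
```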

Back to scenario 1:

HH: wake wake
HT: wake sleep
TH: sleep wake
TT: sleep sleep

Now you wake up. That is new information, and you can eliminate TT. But the question is, how is that relevant to the coins being different? If you are treating "today" as randomly selected from all days that exist in reality, then that would allow you to update. But if you are not treating "today" as randomly selected at all, then by what mechanism can you update?

Just going by intuition, I personally don't think you should update. In this scenario the coin doesn't need to be tossed until the morning: heads they wake you up, tails they don't. So when you wake up, you do get new information, just as in the blue/red example. But since the coins are independent of each other, how can learning about that morning's coin tell you anything about the other coin you don't see? Unless you are using a random selection process in which "today" is not primitive.

I find this idea very interesting, especially since it seems to me that it gives different probabilities than most other versions of halfing. I wonder if you agree with me about how it would answer this scenario (due to Conitzer):

Two coins are tossed on Sunday. The possibilities are

HH: wake wake
HT: wake sleep
TH: sleep wake
TT: sleep sleep

When you wake up (which is no longer guaranteed), what probability should you give that the coins came up differently?

According to most versions of halfing, it would be 2/3. You could say that when you wake up you learn that you wake up at least once, eliminating TT. Alternatively, you could say that when you wake up the day is selected randomly from all the days you wake up in reality. Either way you get 2/3.
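The elimination-of-TT arithmetic can be checked with a short simulation (a sketch; the function name is mine):

```python
import random

def p_diff_given_awake_at_least_once(trials=100_000, seed=1):
    # Two fair coins tossed on Sunday; heads on a day means you are
    # woken that day.  The halfer update conditions on "I wake at
    # least once", i.e. it only rules out TT.
    rng = random.Random(seed)
    kept = 0
    kept_and_diff = 0
    for _ in range(trials):
        c1, c2 = rng.choice("HT"), rng.choice("HT")
        if (c1, c2) != ("T", "T"):   # at least one waking
            kept += 1
            kept_and_diff += (c1 != c2)
    return kept_and_diff / kept

print(p_diff_given_awake_at_least_once())  # near 2/3
```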

However, what if we say that "today" is not selected at all from our perspective? If "today" wasn't selected at all, it can't possibly tell us anything about the other day. So it would be 1/2 probability that the coins are different.

The weird thing about this is that if we change the situation into:

HH: wake wake
HT: wake sleep
TH: sleep wake
TT: wake wake

Now it seems like we are back to the original sleeping beauty problem, where again we would say 1/2 for the probability that the coins are different. How can the probability not change despite TT: sleep sleep turning into TT: wake wake?

And yet, from my own perspective, I could still say that "today" was not selected. So it still gives me no information about whether the other coin is different, and the probability has to stay at 1/2.

I'm talking about the method you're using. It looks like when you wake up and experience y you are treating that as equivalent to "I experience y at least once."

This method is generally incorrect, as shown in the example. Waking up and experiencing y is not necessarily equivalent to "I experience y at least once."

If you yourself believe the method is incorrect when y is "flip heads", why should we believe it is correct when y is something else?

The question is about what information you actually have.

In the linked example, it may seem that you have precisely the information "there is at least one heads." But if you condition on that, you get the wrong answer. The explanation is that, in this kind of memory-loss situation, waking up and experiencing y is not equivalent to "I experience y at least once." When you wake up and experience y, you do know that you must experience y on either Monday or Tuesday, but your information is not equivalent to that statement.

If you asked on Sunday "will I experience y at least once?" then the answer would be relevant. But if we nailed down the precise information gained from waking up and experiencing y, it would turn out to be irrelevant.

I'm referring to an example from here: https://users.cs.duke.edu/~conitzer/devastatingPHILSTUD.pdf where you do wake up both days.

Your argument seemed similar, but I may be misunderstanding:

"Treating these and other differences as random, the probability of Beauty having at some time the exact memories and experiences she has after being woken this time is twice as great if the coin lands Tails than if the coin lands Heads, since with Tails there are two chances for these experiences to occur rather than only one."

It sounds like you are conditioning on "such experiences occur at least once." That is, if Beauty wakes up and flips a coin, getting heads, and that's the only experience she has so far, she will condition on "at least one heads." This doesn't seem generally correct, as the linked example shows. Doesn't it also mean that, even before the coin flip, she would know exactly how she was going to update her probability afterward, regardless of the result?

Perhaps the issue here is that if you wake up and flip heads, that isn't the same thing as if, on Sunday, you asked "will I flip at least one heads?" and got an affirmative answer. The latter is relevant to the number of wakings. The former is not.

I don't understand the reasoning for using irrelevant information.

If you are saying that experiencing y "at least once" is twice as probable on tails, doesn't that fail by the same argument Conitzer gave against halfers? His example was that you wake up both days and flip a coin each day. If you flip heads, what is the probability that both flips are the same? You are twice as likely to experience heads at least once if the coin tosses are different, but that is irrelevant: the probability of "both the same" is still 1/2.
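To make the contrast concrete, here is a small Monte Carlo sketch (names mine) of this two-flip setup, estimating P(both flips the same) under the two conditioning rules:

```python
import random

def conitzer_comparison(trials=200_000, seed=1):
    # You wake on both days and flip a fresh fair coin each day.
    # Compare two ways of using "I just flipped heads":
    #   (a) condition on heads showing on a uniformly random awakening;
    #   (b) condition on "at least one of the two flips is heads".
    # Returns the two resulting estimates of P(both flips the same).
    rng = random.Random(seed)
    a_n = a_same = b_n = b_same = 0
    for _ in range(trials):
        f1, f2 = rng.choice("HT"), rng.choice("HT")
        same = (f1 == f2)
        if rng.choice((f1, f2)) == "H":   # (a) heads on this awakening
            a_n += 1
            a_same += same
        if "H" in (f1, f2):               # (b) at least one heads
            b_n += 1
            b_same += same
    return a_same / a_n, b_same / b_n

print(conitzer_comparison())  # roughly (0.5, 0.333)
```

Rule (a) leaves P(both the same) at 1/2, while rule (b) pushes it to 1/3, which is exactly the gap between "I flipped heads just now" and "I flip heads at least once."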

On the other hand, in reality there might be some relevant information (such as noticeable aging, hunger, etc.), but the problem is meant to exclude that.