I flipped a few times between 1/2 and 1/3 before realizing that they are both valid answers to different questions.
Say I ask you to draw a card and then, without looking at it, show it to me. I tell you that it is an Ace, and ask you for the probability that you drew the Ace of Spades. Is the answer 1/52, 1/4, or (as you claim about the SB problem) ambiguous?
I think it is clear that I wanted the conditional probability, given the information you have received. Otherwise, what was the point of asking after giving the information?
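As a sanity check, here is a tiny simulation of the card example (my own sketch; the deck encoding is arbitrary) showing that conditioning on the information "it is an Ace" gives 1/4, not 1/52:

```python
import random

# Sketch of the card example above: estimate Pr(Ace of Spades | card is an Ace).
deck = [(rank, suit) for rank in range(1, 14) for suit in "SHDC"]  # rank 1 = Ace

aces = ace_of_spades = 0
for _ in range(100_000):
    rank, suit = random.choice(deck)
    if rank == 1:                       # we are told the card is an Ace
        aces += 1
        ace_of_spades += (suit == "S")

print(ace_of_spades / aces)             # ~0.25: the conditional probability, not 1/52
```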
The "true" halfer position is not that ambiguity; it is that the information SB has received is null, so the conditional probability is the sam...
The wording of the question is ambiguous. It asks for your assessment of the likelihood that the coin was heads when you were "first awakened", but from your perspective every awakening is you being first awakened. If it is really asking for your assessment given the information that the question is being asked on your actual first awakening, regardless of your perception, then it's 1/2. If you know the question will be asked on your first or second awakening (though the second will, in the moment, feel like the first), then it's 1/3.
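For concreteness, here is a small simulation of the two readings (my own encoding, not part of the original question):

```python
import random

# My own encoding of the two readings (not part of the original question).
# Heads -> one awakening; Tails -> two indistinguishable awakenings.
first_only = []   # reading 1: asked only at the actual first awakening
every_time = []   # reading 2: asked at every awakening

for _ in range(100_000):
    heads = random.random() < 0.5
    awakenings = 1 if heads else 2
    first_only.append(heads)                 # one question per experiment
    every_time.extend([heads] * awakenings)  # one question per awakening

print(sum(first_only) / len(first_only))  # ~1/2 under reading 1
print(sum(every_time) / len(every_time))  # ~1/3 under reading 2
```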
The wording may be bad, but I think the second interpretation is what is intended. Otherwise the discussion often seen of "How might your beliefs change if after awakening you were told it is Monday?" would make no sense, since your actual first awakening is always on Monday (though you may experience what feels like a first awakening on Tuesday).
The same problem statement does not mention Monday or Tuesday, or describe any timing difference between a "mandatory" waking and an "optional" one. (There is another missing element, which I will defer discussing until I finish this thought.) It just says you will be wakened once or twice. Elga added these elements as part of his solution. They are not part of the problem he asked us to solve.
But that solution added more than just the schedule of wakings. After you are "first awakened," what would change if you were told that the day is Monday? Or ...
You should probably use "last awakening" instead of "first awakening" in your attempt at disambiguation. See Radford Neal's comment for the reason why.
I exchanged a few PMs with a friend who moved my opinion from 1/2 to 1/3, but that was when I hadn't yet thought about the problem much. I'd be extremely surprised if I ever change my mind now (still on 1/3). I don't remember the arguments we made.
Is your current certainty in the correctness of thirdism based on some specific arguments that you remember? I know that there are a lot of arguments for thirdism, but I'd like to find the strongest ones.
Initially, I had a strong feeling/intuition that the answer was 1/3, but felt that because you can also construct a betting situation for 1/2, the question was not decided. In general, I've always found betting arguments the strongest form of argument: I don't much care how philosophers feel about the right way to assign probabilities; I want to make good decisions in uncertain situations, for which betting arguments are a good abstraction. "Rationality is systematized winning" and all that.
Then I read this comment, which showed me that I had made a mistake by accepting the halfer betting situation as an argument for 1/2. In retrospect, I could have avoided this by actually doing the math, but it's an understandable mistake; people have finite time. In particular, this sentence on the Sleeping Beauty Paradox tag page makes the same mistake: "If Beauty's bets about the coin get paid out once per experiment, she will do best by acting as if the probability is one half." No: as the linked comment shows, it is advantageous to bet 1:1 in some interpretations, but that's exactly because the actual probability is 1/3.

Note that there is no rule/axiom that a bet's odds must always correspond to the event's probability; that correspondence is something that can be derived in non-anthropic situations, assuming rational expected-money-maximizing agents. It's more accurate to call what the above situation points to a scoring rule, and you can make up situations with other scoring rules too: "Sleeping Beauty, but Omega will kick you in the nuts/vulva if you don't say your probability is 7/93." In this case it is also advantageous "to behave as if" the probability is 7/93 in some respect, but the probability in your mind should still be the correct one.
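To see the numbers, here is a rough simulation (my construction, not the linked comment's) of per-awakening betting on Heads; the break-even price matches the thirder credence:

```python
import random

# A rough sketch (my construction, not the linked comment's): at every
# awakening, Beauty may buy for `price` a ticket paying $1 if the coin
# landed Heads. Under Tails the offer is made twice, once per awakening.
def mean_profit_per_experiment(price, trials=200_000):
    total = 0.0
    for _ in range(trials):
        heads = random.random() < 0.5
        for _ in range(1 if heads else 2):   # one or two awakenings
            total += (1.0 if heads else 0.0) - price
    return total / trials

print(mean_profit_per_experiment(1 / 3))  # ~0: break-even price is 1/3
print(mean_profit_per_experiment(1 / 2))  # < 0: the "halfer" price loses money
```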
Thank you for bringing this to my attention. As a matter of fact, in the linked comment Radford Neal is dealing with a weak-man, while conveniently assuming that other alternatives "are beyond the bounds of rational discussion", which is very much not the case.
But it is indeed a decent argument that deserves a detailed rebuttal. And I'll make sure to provide it in the future.
I confidently switched from 1/3 to 1/2, then back to 1/3, then noticed the inconsistency. I'm now not certain that the question makes sense as posed at all, and I'm not sure what would fix it - maybe specifying better why one wants to know the answer, so that it can be answered by decision theory rather than by some objective "which one is true".
I was an inveterate thirder until I read a series of articles on repeated betting, which pointed out that in many cases, maximizing expected utility leads to a “heavy tailed” situation in which a few realizations of you have enormous utility, but most realizations of you have gone bankrupt. The mean utility across all realizations is large, but that’s useless in the vast majority of cases because there’s no way to transfer utility from one realization to another. This got me thinking about SB again, and the extent to which Beauties can or can not share or transfer utility between them. I eventually convinced myself of the halfer position.
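That phenomenon is easy to reproduce. Here is a sketch with illustrative numbers (mine, not from those articles) of a positive-expected-value multiplicative bet whose typical outcome is still ruin:

```python
import random, statistics

# Illustrative numbers (mine, not from the articles above): each round wealth
# is multiplied by 1.5 or 0.6 with equal probability, so E[multiplier] = 1.05
# per round, yet E[log multiplier] = 0.5*(ln 1.5 + ln 0.6) < 0, so the typical
# path decays toward zero.
def final_wealth(rounds=50):
    w = 1.0
    for _ in range(rounds):
        w *= 1.5 if random.random() < 0.5 else 0.6
    return w

outcomes = [final_wealth() for _ in range(10_000)]
print(statistics.mean(outcomes))    # well above 1: a few lucky runs carry the mean
print(statistics.median(outcomes))  # well below 1: most realizations went broke
```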
Here’s the line of reasoning I used. If the coin comes up H, we have one awakening (experience A). If the coin comes up T, we have two awakenings - either in series or in parallel depending on the variant, but in any case indistinguishable. By Bayes, Pr(H|A) = Pr(A|H)Pr(H)/Pr(A). The core insight is that Pr(A|H) = Pr(A|T) = Pr(A) = 1, since you have experience A no matter what the coin flip says; hence Pr(H|A) = Pr(H) = 1/2. SB is akin to drawing a ball from one of two jars, one of which contains one red ball, and the other of which contains two red balls. Having drawn a red ball, you learn nothing about which jar you drew from.
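The jar analogy can be checked directly; here is a quick sketch (my encoding of the setup above):

```python
import random

# The two-jar setup from the paragraph above: jar "H" holds one red ball,
# jar "T" holds two. A fair coin picks the jar, then a ball is drawn.
jars = {"H": ["red"], "T": ["red", "red"]}

heads_jar = red_draws = 0
for _ in range(100_000):
    jar = random.choice(["H", "T"])
    if random.choice(jars[jar]) == "red":   # always true: every ball is red
        red_draws += 1
        heads_jar += (jar == "H")

print(heads_jar / red_draws)  # ~0.5: Pr(H|red) = Pr(H), since Pr(red|jar) = 1 for both
```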
What about making bets, though? Say that SB is offered a chance to buy a ticket worth $1 if the coin was T, and $0 if it was H. To maintain indistinguishability between the “three Beauties,” each time she is awakened, she must be offered the same ticket. In this case, SB should be willing to pay up to $2/3 for such a ticket. But this is not because the probability of T is really 2/3 - it is because the payoff for T is larger since the bet is made twice in sequence. In the “clones” variant, SB’s valuation of the ticket depends on how she values the welfare of her clone-sister: if she is perfectly selfish she values it at $1/2, whereas if she is perfectly altruistic she values it at $2/3. Again, this is because of variations in the payout - obviously SB’s estimate of the probability of a coin flip cannot depend on whether she is selfish or not!
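The arithmetic behind those break-even prices can be written out explicitly (my rendering of the reasoning above):

```python
# Expected profit per experiment from buying, at `price`, a ticket that pays
# $1 on Tails, offered at every awakening (two awakenings under Tails).
def ev_sequential(price):
    return 0.5 * (-price) + 0.5 * 2 * (1 - price)

# Clones variant, perfectly selfish Beauty: only her own single ticket counts,
# so the Tails payoff is collected once rather than twice.
def ev_selfish_clone(price):
    return 0.5 * (-price) + 0.5 * (1 - price)

print(ev_sequential(2 / 3))      # 0.0: break-even at $2/3, as argued above
print(ev_selfish_clone(1 / 2))   # 0.0: break-even at $1/2 for the selfish clone
```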
A lot of anthropic arguments depend on simply “counting up the observers” and using that count as a proxy for probability. This is illegitimate, because conditional probabilities must always be normalized to sum to one: Pr(Monday|T) + Pr(Tuesday|T) = 1/2 + 1/2 = 1. Any time you use conditional probability you have to be very careful: Pr(Monday|T) != Pr(Monday and T); the latter is Pr(Monday|T)Pr(T) = 1/4.
I try to avoid any discussion of repeated betting, because of the issues you raise. Such arguments address the unorthodox part of an unorthodox problem, and so can be used to get whichever solution you prefer.
But that unorthodox part is unnecessary. In my comment to pathos_bot, I pointed out that there are significant differences between the problem as Elga posed it and the problem as it is used in the controversy. In the posed problem, the probability question is asked before you are put to sleep, and there is no Monday/Tuesday schedule. In his solution, Elga n...
I have weak intuitions for these problems, and in net they make me feel like my brain doesn't work very well. With that to disclaim my taste, FWIW I think your posts are some of the most interesting content on modern day LW.
It'd be fun to hear you debate anthropic reasoning with Robin Hanson esp. since you invoke grabby aliens. Maybe you could invite yourself on to Robin & Agnes' podcast.
> FWIW I think your posts are some of the most interesting content on modern day LW.
Thank you for such high praise! It was unexpected and quite flattering.
https://en.wikipedia.org/wiki/Sleeping_Beauty_problem
If that happened, what was the argument that did it for you?
I'm interested in situations where a person used to think that the correct answer was 1/2 and then, on reflection, decided that it's actually 1/3, or vice versa; not in cases where the resulting belief is that the question is meaningless or that both answers are valid.