Disclaimer: this is a personal blog post that I'm posting for my own benefit and that I do not expect to be valuable to LessWrong readers in general. If you're reading this, consider not.

Second disclaimer: this has at least one error I need to fix.

This post analyzes the "Snake Eyes Paradox" as specified by Daniel Reeves.

Problem statement

You're offered a gamble where a pair of six-sided dice are rolled and unless they come up snake eyes you get a bajillion dollars. If they do come up snake eyes, you're devoured by snakes.

So far it sounds like you have a 1/36 chance of dying, right?

Now the twist. First, I gather up an unlimited number of people willing to play the game. I take 1 person from that pool and let them play. Then I take 2 people and have them play together, where they share a dice roll and either get the bajillion dollars each or both get devoured. Then I do the same with 4 people, and then 8, 16, and so on.

At some point one of those groups will be devoured by snakes and then I stop.

Is the probability that you'll die, given that you're chosen to play, still 1/36?

This is very similar to the Shooting Room Paradox.

Less death please

People are known to have rationality issues around death. I expect that philosophers prefer death-related thought experiments for signaling reasons. I don't need to send those signals, so I would like to discuss a less violent equivalent:

I am a god, creating snakes. I group the snakes into exponential batches. The first batch is one snake, the second batch is two snakes, etc. After gathering each batch, I roll two dice. If I get snake eyes, I color all their eyes red and stop making snakes. Otherwise I color their eyes blue and continue to the next batch.

You are a snake. What is the probability that your eyes are red?

Anthropics

Depending on the dice results, there might be more or fewer snakes. As a snake, we are more likely to exist in worlds with more snakes. This is awkward because anthropic probability appears to be an unsolved philosophical problem. Should we use the Self-Indication Assumption (SIA) or the Self-Sampling Assumption (SSA)?

I'm going to proceed with both.

Conditional limit

We're going to condition on the cases where the deity happens to stop after N rounds or fewer, and calculate the conditional probability of red eyes in that restricted universe. Then we're going to take the limit as N tends to infinity.

Because I'm not a real mathematician, I'm not going to find the actual limit, but just show that the limit is at least 50%.

N = 1

When N = 1, there is a single snake with red eyes. The probability of a randomly chosen snake having red eyes is 100%. This is > 50%.

N = 2

When N = 2 we have two possible scenarios:

  • S1: There is a single snake with red eyes. P(red) = 100%.
  • S2: There is one snake with blue eyes and two snakes with red eyes. P(red) = 66.7%.

Because both scenarios have P(red) > 50%, the combined P(red) > 50%.

This is all we need, but let's do the exact calculation for fun.

From the deity's perspective, P(S1) = 1/36 and P(S2) = 35/36 × 1/36. Therefore P(S1 given N=2) = 36/(36+35) = 36/71. This is also the probability under SSA anthropics. Therefore P(red) = 36/71 × 1 + 35/71 × 2/3 ≈ 83.6%.

From the snake's perspective, under SIA anthropics, there are three times as many snakes in S2, so it is three times more likely for a snake to find itself in S2. So P(S1 given N=2) = 36/(36+35+35+35) = 36/141. Therefore P(red) = 36/141 × 1 + 105/141 × 2/3 ≈ 75.2%.
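For the curious, both numbers can be reproduced with exact rational arithmetic. A minimal Python sketch (variable names are my own; the weights follow the scenario definitions above):

```python
from fractions import Fraction

p = Fraction(1, 36)           # chance of snake eyes on one roll
p_s1 = p                      # S1: snake eyes on the first roll
p_s2 = (1 - p) * p            # S2: miss, then snake eyes

red_s1 = Fraction(1)          # red fraction in S1: 1 of 1 snake
red_s2 = Fraction(2, 3)       # red fraction in S2: 2 of 3 snakes

# SSA / deity view: weight scenarios by probability only.
ssa = (p_s1 * red_s1 + p_s2 * red_s2) / (p_s1 + p_s2)

# SIA view: additionally weight each scenario by its snake count.
w1, w2 = p_s1 * 1, p_s2 * 3
sia = (w1 * red_s1 + w2 * red_s2) / (w1 + w2)

print(float(ssa))  # ~0.836
print(float(sia))  # ~0.752
```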

N = lots

For all values of N we find that there are N possible scenarios, S1...SN. In scenario Sk, the final batch of 2^(k-1) red-eyed snakes outnumbers the 2^(k-1) - 1 blue-eyed snakes from all the earlier batches combined, so P(red) = 2^(k-1)/(2^k - 1) > 50%. Therefore the combined probability P(red) > 50%.

Can we take the limit?

From the deity's perspective, or under SSA anthropics, we can absolutely take the limit. The probability of the game ending before we reach N tends to zero, so the conditional probability approaches the complete probability. Therefore Limit(P(red)) >= 50%.
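The SSA conditional probability for general N can also be computed exactly. A hedged sketch (my own construction, not from the post): weight scenario Sk, where snake eyes lands on round k, by its probability, and use the red-eyed fraction 2^(k-1)/(2^k - 1) within that scenario:

```python
from fractions import Fraction

def p_red_ssa(n):
    """P(red), conditional on the game ending within n rounds, under SSA:
    weight scenario S_k by its probability; the red-eyed fraction within
    S_k is 2^(k-1) / (2^k - 1), always greater than 1/2."""
    p = Fraction(1, 36)
    weight = Fraction(0)
    value = Fraction(0)
    for k in range(1, n + 1):
        w = (1 - p) ** (k - 1) * p  # P(snake eyes lands on round k)
        weight += w
        value += w * Fraction(2 ** (k - 1), 2 ** k - 1)
    return value / weight

for n in (1, 2, 10, 100):
    print(n, float(p_red_ssa(n)))
# falls from 1.0 toward a limit just above 1/2 (about 0.52)
```

Every value stays above 50%, consistent with the argument above.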

From the snake's perspective, under SIA anthropics, we can't take the limit. For any value of N, most snakes live in worlds where the game continues past round N. This is because the number of snakes in each batch increases faster than the probability of reaching the batch decreases.

Limit for a blue-loving deity

See Daniel Reeves's treatment of the problem for a comparison. For this limit we first address a different problem.

I am a god, creating snakes, as before. But after N batches I get bored of snakes, and I leave all the snakes with blue eyes.

N = 1

When N = 1, there is a single snake. It has a 1/36 chance of having red eyes (2.8%).

N = 2

When N = 2, we have two possible scenarios:

  • S1: There is a single snake with red eyes. P(red) = 100%.
  • S2: There is one snake with blue eyes and two snakes which each have a 1/36 chance of red eyes. P(red) = 2/3 × 1/36 ≈ 1.9%.

From the deity's perspective, or from the snake's perspective under SSA, the probability of each scenario is:

P(S1) = 1/36, P(S2) = 35/36.

This results in an overall P(red) = 1/36 + 35/36 × 2/3 × 1/36 ≈ 4.6%.

From the snake's perspective, under SIA anthropics, there are three times as many snakes in S2, so it is three times more likely for a snake to find itself in S2. Instead of imagining one S1 world and 35 S2 worlds, based on the raw odds, we imagine one S1 world and 105 S2 worlds. So P(S1) = 1/106. This results in an overall P(red) = 2.8%, which is exactly 1/36.
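That the SIA answer is exactly 1/36 for every N can be checked directly. A sketch (my own framing): under SIA, P(red) works out to the expected number of red snakes divided by the expected total number of snakes, and batch k joins the numerator exactly 1/36 as often as it joins the denominator:

```python
from fractions import Fraction

def p_red_sia_blue(n):
    """Blue-loving deity who stops after n batches, SIA view."""
    p, q = Fraction(1, 36), Fraction(35, 36)
    # E[red snakes]: batch k (size 2^(k-1)) is red iff the game reaches
    # round k AND that round's roll is snake eyes.
    e_red = sum(q ** (k - 1) * p * 2 ** (k - 1) for k in range(1, n + 1))
    # E[all snakes]: batch k exists iff the game reaches round k.
    e_all = sum(q ** (k - 1) * 2 ** (k - 1) for k in range(1, n + 1))
    return e_red / e_all

print(all(p_red_sia_blue(n) == Fraction(1, 36) for n in (1, 2, 10, 50)))
```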

N = more

As N increases, P(red) continues to creep up under SSA anthropics, and is certainly > 1/36. I leave calculating its value as an exercise for the reader, or possibly for myself on another night. P(red) stays at 1/36 under SIA anthropics.
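For anyone tempted by the exercise, here is one way to compute the SSA value exactly (a sketch, my own construction; all-blue worlds simply contribute a red fraction of zero, so no conditioning is needed):

```python
from fractions import Fraction

def p_red_ssa_blue(n):
    """Blue-loving deity who stops after n batches, SSA view: average
    the red-eyed fraction over worlds. Stopping at round k gives red
    fraction 2^(k-1)/(2^k - 1); surviving all n rounds gives 0."""
    p, q = Fraction(1, 36), Fraction(35, 36)
    return sum(q ** (k - 1) * p * Fraction(2 ** (k - 1), 2 ** k - 1)
               for k in range(1, n + 1))

for n in (1, 2, 10, 500):
    print(n, float(p_red_ssa_blue(n)))
# climbs from 1/36 (2.8%) through 4.6% at n=2 toward a limit near 0.52
```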

Can we take the limit?

This is the same as the first attempt. Under SSA anthropics, we can take the limit, as the probability of all snakes being blue-eyed tends to zero. Under SIA anthropics we can't take the limit, as most snakes continue to live in worlds where all snakes are blue-eyed.

Limit for a red-loving deity

Here is another possible limiting process.

I am a god, creating snakes, as before. But after N batches I get bored of snakes, and I always give the last batch of snakes red eyes.

This proceeds much as the previous example, except P(red) starts at 100% when N=1 and then decreases as N increases. Under SSA anthropics it has a limit at >= 50%. Under SIA anthropics we can't take the limit.

Conclusion

Under SSA anthropics, we have shown that one of these possibilities is true:

  • There is a defined probability, and it is >= 50%.
  • There is not a defined probability, and different limiting processes result in different answers.

Under SIA anthropics we were not able to find a limit using any process. It's possible that more advanced techniques will be able to find limits that work. I think the most plausible answer is that the probability is undefined.

Resolution of a Manifold Market

See Manifold Markets: Is the probability of dying in the Snake Eyes Paradox 1/36?

The Manifold market has a FAQ that specifies that we can use the "limit for a blue-loving deity" process above. As discussed above, this means that we must be using SSA anthropics, as the limit is not valid under SIA anthropics. Also as shown above, the limit under SSA anthropics is > 1/36. Therefore the market should resolve NO. Further investigation may provide a tighter bound but is not needed to resolve the market.


11 comments

Because I'm not a real mathematician, I'm not going to find the actual limit, but just show that the limit is at least 50%.

Note that 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1.  Therefore, if we have a bunch of blue-eyed groups of size 1, 2, 4, ..., 2^(n-1), and one red-eyed group of size 2^n, then the overall fraction of snakes that are red-eyed is 2^n / (2^n + 2^n - 1), which, if we divide the numerator and denominator by 2^n, comes out to 1 / (2 - 1/(2^n)).  This is slightly above 1/2, and the limit as n -> ∞ is exactly 1/2.

This is exactly right under SIA, thanks. Under SIA, almost all the snakes exist in the final scenario, and therefore the limit is 50% as n -> infinity.

Under SSA it's a bit higher than 50%, because we always have a 1/36 chance of there being a single red-eyed snake.

It would be useful to have a summary (at the top or bottom) that this post is mostly about comparing SSA and SIA, and readers familiar with the differences can skim most of it.  

IMO, the key is to remember that probability is in the map (or the agent's head, if you prefer), not the territory. Unless you're talking about God (who famously doesn't play dice, but kind of seems to), all probabilities are 1 or 0 - either that is the universe you're in, or it isn't. Your assignment of probability is based on your prediction of which universe you're in (ignoring logical uncertainty for this comment). Which means you really need to ask about what future experience OF THE AGENT is being predicted by the probability calculation of the agent.

From the experiment-runner's perspective, half of players die, and any given player AFTER the experiment, without knowing what group the player is in, has 50% chance of death.  From a PLAYER's perspective, they have a 1/36 chance of death if the experiment is still open, and a 0% chance of death if it's finished and they're alive to answer the question.

Interesting. Using the snake-creating deity setting, what should I expect as a newly created sightless snake, waiting for my eyes? Suppose that the deity will answer my questions while I wait for the dice roll.

SIA: I expect that the deity has created an indescribably large number of batches, and has not rolled snake eyes yet. I expect that there is a 1/36 chance that they will roll snake eyes this time. If they don't, they will likely roll up more batches, and those probabilities will be pretty normal. And then I'll end up in an indescribably large world with an indescribable number of snakes, of which 50% are red-eyed.

SSA: I expect that the deity has created several batches of snakes, within the expected bounds of a geometric distribution with p = 1/36. I expect that there is a 50% chance that they will roll snake eyes this time, because while the dice are fair, if the deity rolls snake eyes then I am about 35x more real. And then I'll end up in a large world with lots of snakes (e.g. 2^36), of which just over 50% are red-eyed.

So under SIA the past is shaped by anthropics, and under SSA the future is shaped by anthropics. And whatever happens I get very compelling evidence on the SSA vs SIA question.

I feel like this is one of the cases where you need to be very precise about your language, and be careful not to use an "analogous" problem which actually changes the situation.

 

Consider the first "bajillion dollars vs dying" variant. We know that right now, there are about 8 billion humans alive. What happens when the exponential increase exceeds that number? We probably have to assume there's an infinite number of humans, fair enough.

What does it mean that "you've chosen to play"? This implies some intentionality, but due to the structure of the game, where the number of players is random, it's not really just up to you. 

 

NOTE: I just realized that the original wording is "you're chosen to play" rather than "you've chosen to play". Damn you, English. I will keep the three variants below, but this means that the right interpretation clearly points towards option B), but the analysis of various interpretations can explain why we even see this as a paradox.

A) One interpretation is "what is the probability that I died given that I played the game?", to which the answer is 0%, because if I died, I wouldn't be around to ask this question. 

B) Second interpretation is "The organizer told you there's a slot for you tomorrow in the next (or first) batch. What is the probability that you will die, given that you are going to play the game?". Here the answer is pretty trivially 1/36. You don't need anthropics, counterfactual worlds, blue skies. The dice will be rolled, and your survival will entirely depend on the outcome of that roll.

C) The potentially interesting interpretation, that I heard somewhere (possibly here) is: "You heard that your friend participated in this game.  Given this information, what is the probability that your friend died during the game?". The probability here will be about 50% -- we know that if N people in total participated, about N/2 people will have died.

 

 

Consider now the second variant with snakes and colors. Before the god starts his wicked game, do snakes exist? Or is he creating the snakes as he goes? The first sentence "I am a god, creating snakes." seems to imply that this is the process of how all snakes are created. This is important, because it messes with some interpretations. Another complication is that now, "losing" the roll no longer deletes you from existence, which similarly changes interpretations. Let's look at the three variants again.

A) "What is the probability you have red eyes given that you were created in this process?" -- here the answer will be ~50%, following the same global population argument as in variant C of the first variant. This is the interpretation you seem to be going with in your analysis, which is notably different than the interpretation that seems to be valid in the first variant.

B) If snakes are being created as you go with the batches, this no longer has a meaning. The snake can't reflect on what will happen to him if he's chosen to be created, because he doesn't exist.

C) "Some time after this process, you befriended a snake who's always wearing shades. You find out how he was created. Given this, what is the probability that he has red eyes?" -- the answer, following again the same global population argument, is ~50%

 

 

In summary, we need to be careful switching to a "less violent" equivalent, because it can often entirely change the problem.

I definitely agree on the need for care in switching between variants. It can also be helpful that they can "change the situation" because this can reveal something unspecified about the original variant. Certainly I was helped by making a second variant, as this clarified for me that the probabilities are different from the deity view vs the snake view, because of anthropics.

In the original variant, it's not specified when exactly players get devoured. Maybe it is instant. Maybe everyone is given a big box that contains either a bajillion dollars, or human-eating snakes, and it opens exactly a year later.

In my variant, I was imagining the god initially created a batch of snakes with uncolored eyes, then played dice, then gave them red or blue eyes. So the snakes, like the players, can have experiences prior to the dice being rolled. And yes, no snakes exist before I start. (why is the god wicked? No love for snakes...) I'll update the text to clarify that no snakes exist until the god of snake creation gets to work.

C) "Some time after this process, you befriended a snake who's always wearing shades. You find out how he was created. Given this, what is the probability that he has red eyes?" -- the answer, following again the same global population argument, is ~50%

I think this is a great crystallization of the paradox. In this scenario, it seems like I should believe I have a 1/36 chance of red eyes, and my new friend has a 1/2 chance of red eyes. But my friend has had exactly the same experiences as me, and they reason that the probabilities are reversed.

Is it epistemically isomorphic to the Sleeping Beauty problem?

The SSA vs SIA debate impacts both questions, but once you pick one of those then in Sleeping Beauty there's a clear answer of 1/2 or 1/3, whereas in this problem the infinities continue to make it unclear what the probability should be.

It depends.

If the croupier chooses the players, then the players learn that they were chosen, then the croupier rolls the dice, then the players either get a bajillion dollars each or die, then (if not snake eyes) the croupier chooses the next players, and so on - the answer is 1/36.

If the croupier chooses the players, then rolls the dice, then (if not snake eyes) chooses the next players and so on, and the players only learn that they were chosen once the dice come up snake eyes (at which point the last players die and all the other players get a bajillion dollars each) - the answer is about 1/2.
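The "about 1/2" figure can be checked with a quick Monte Carlo sketch (my own construction, assuming the doubling batch sizes from the post; it averages the fraction of participants who die, per completed game):

```python
import random

def avg_death_fraction(games=100_000, seed=0):
    """Average, over completed games, of the fraction of all
    participants devoured. If snake eyes lands on round k, the last
    batch of 2^(k-1) players dies out of 2^k - 1 total players."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(games):
        k = 1
        while rng.randrange(36) != 0:  # roll until snake eyes
            k += 1
        total += 2 ** (k - 1) / (2 ** k - 1)
    return total / games

print(avg_death_fraction())  # ~0.52
```

Note that this averages per game; averaging per player instead runs into the divergence problems discussed in the post.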

Are all snakes equally real (in proportion to their likelihood of existing)? I'm assuming yes in the post, but that leads to weirdness like undefined probabilities, which we'd prefer to avoid. Also, because there are infinite snakes, there would be a zero probability of being any particular snake, or (under SIA) being in any particular batch of snakes.

If we distribute realityfluid to the snakes in differing amounts, to avoid that, then we can probably avoid the undefined probabilities. The probability of having red eyes will depend on the formula for distributing realityfluid. Then instead of saying "the probability is undefined" we can say "the probability depends on this undefined formula".

Also, because there are infinite snakes, there would be a zero probability of being any particular snake, or (under SIA) being in any particular batch of snakes.

Hot take, but this is actually a valid answer, and in infinite sets/infinite outcome spaces this can actually start mattering.

The classic example is that if you pick a number from the real number line at random, you will have a 0% chance of picking a rational number, or indeed any particular number.
