In The I-Less Eye, R Wallace considers a situation in which an entity clones itself a hundred times, leading to a surprising paradox. I'll argue that there's a rather simple flaw in the argument made in the linked article, but first I'll summarise the riddle.

Suppose you are the original. After you've cloned yourself, you have a 50% chance of being the original and a 50% chance of being the clone. This should apply regardless of what clones already exist, so after cloning yourself 100 times, you should have a 1/2^100 chance of being the original.

On the other hand, this seems strange. Intuitively, it seems as though you ought to have a 1/101 chance of being the original, as there are 101 copies. Further, why does cloning one at a time give a different answer from creating all 100 clones at once?
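For concreteness, here is how far apart the two candidate answers are (a quick check in Python; the numbers are just the riddle's own):

```python
print(0.5 ** 100)  # halving at every press: about 7.9e-31
print(1 / 101)     # uniform over the 101 copies: about 0.0099
```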

Solution

In order to solve this riddle, we only need to figure out what happens when you've been cloned twice and whether the answer should be 1/3 or 1/4. The first step is correct: the subjective probability of being the original should be 1/2 after you've pressed the cloning button once. However, after we've pressed the cloning button twice, in addition to the two agents who underwent that first split, we now have an agent who falsely remembers undergoing it.

Distributing the probability evenly among the agents who either had that experience or merely remember it, we get a 1/3 chance of the memory being false and a 2/3 chance of it being real. If it is a real memory, then half of that 2/3 - that is, 1/3 - is the probability of being the original, and the other half - also 1/3 - is the chance of being the first clone.

So, the answer at the second step should be 1/3 instead of 1/4, and continued application will provide the answer for 100 copies. I'll admit that I've sketched out the reasoning for the 1/n solution rather than providing a formal proof. However, I would suggest that I've sufficiently demonstrated that halving each time is mistaken, as it assumes that each remembered split is real.
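To make the sketch slightly more concrete (this is my own formalisation of the step above, under the same uniform-credence assumption, not a proof from the original article): after the button has been pressed k times there are k+1 agents, all of whom remember the first split, but only two of whom actually experienced it. So

P(original) = P(experienced split 1) × P(original | experienced split 1) = (2/(k+1)) × (1/2) = 1/(k+1),

which gives 1/101 after 100 presses.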

We can also verify that the 1/n solution produces sensible results. Exactly two agents experience the process of each split (though more agents remember it). So there is a 2/n chance of you experiencing the process and a 1/n chance of you experiencing it as the original. Hence there is a 50% chance of you "waking up" as the original after the split, given that you actually underwent the split and didn't just falsely remember it.
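Here is a minimal sanity check of that bookkeeping, assuming the uniform self-location prior argued for above; it mechanises the assumption rather than proving it (the agent labels are mine):

```python
import random

# Agent 0 is the original; agent k is the clone created at the k-th press.
# Only agents 0 and 1 actually experienced the first split; every later
# clone merely has a copied memory of it.
def check(n_presses=100, trials=200_000):
    n = n_presses + 1
    experienced = original = 0
    for _ in range(trials):
        you = random.randrange(n)  # uniform self-location among n copies
        experienced += you in (0, 1)
        original += you == 0
    print(experienced / trials)    # ~ 2/n  (2/101, about 0.0198)
    print(original / trials)       # ~ 1/n  (1/101, about 0.0099)
    print(original / experienced)  # ~ 1/2: "waking up" as the original

check()
```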

Please note that I'm not taking a position in this post as to whether subjective probabilities ultimately are or aren't coherent, just arguing that this particular argument is fallacious.

Finally, I'll note a few questions that this opens up. If we have to include in anthropic reasoning all agents who remember a situation, and not just those who experienced it, what actually counts as remembering a situation? After all, in the real world, memories are always imperfect. Secondly, what if an agent has a memory but never accesses it? Does that still count?

EDIT: As you can see from my response to the comments, this post has some issues. Hopefully, I am able to update it at some point, but this isn't an immediate priority.

Comments

This is also interesting if you imagine that you and your other copies don't all have the same memories - instead, each person remembers how many times they've been copied. So the first copy never remembers getting copied, and then the second copy remembers getting copied once, etc. Now the probabilities are no longer so obvious, because the only straightforward probability is the 50% probability the two most recent yous have that they're the other one - everyone else is distinguishable. The "probability" is instead some measure of caring, or personal identity transfer (to the extent that one believes that there is some kind of identity-fluid that obeys locality and conservation when getting transferred around).

There's a standard way to elicit practical 'probabilities' via bets in these situations: play for charity (with independent bets). In order to have something to bet about, let's turn the memories back off, and ask people to bet on whether they're the original (perhaps you have a tattoo on the back of your neck that doesn't get copied), betting with their favorite charity's money. At what odds should they make these bets - when does the charity start making money if your decision procedure says to bet? It's at 1:N odds, not 1:2^N odds.
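A rough sketch of that bet's arithmetic (my formalisation of the comment above, with a made-up one-unit stake): every copy bets one unit of the charity's money that they are the original, at odds of 1:k. Exactly one of the n copies is right, so per run the charity nets k from the winner and loses one unit for each of the others:

```python
def charity_net(n_copies: int, odds: int) -> int:
    # One winner gains `odds`; the other n_copies - 1 each lose their 1-unit stake.
    return odds - (n_copies - 1)

print(charity_net(101, 100))         # 0: break-even at 1:100, i.e. 1:N
print(charity_net(101, 2**100 - 1))  # astronomically positive: 1:2^N is far past break-even
```

So the charity starts making money exactly at 1:N odds, as the comment says.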

Again, we should contrast this with the case where you have some family heirloom in a box, and whenever you're duplicated you put the box at random in the room of one of your two causal descendants. If only the original gets duplicated 100 times, the box really is in the first copy's room with 50% probability and in the original's room with 1 in 2^100 probability. So if personal identity behaves like physical stuff that gets passed on to your duplicate descendants, changing how you make copies changes the distribution of the stuff.
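Unlike self-locating probability, the box really can be simulated; here is a quick sketch (my rendering of the setup above):

```python
import random
from collections import Counter

# At each split only the original is duplicated, and the box goes at random
# to one of the two causal descendants. Once it lands with a clone it stays
# there, since clones are never duplicated again.
def box_owner(n_splits=100):
    owner = "original"
    for k in range(1, n_splits + 1):
        if owner == "original" and random.random() < 0.5:
            owner = f"clone {k}"
    return owner

counts = Counter(box_owner() for _ in range(100_000))
print(counts["clone 1"] / 100_000)  # ~0.5
print(counts["clone 2"] / 100_000)  # ~0.25
print(counts["original"])           # almost always 0: probability 1/2^100
```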

Another variation might be to Sleeping Beautify this problem - change the duplication into memory erasure. This makes more sense if we use the variation where you remember how many times you've been copied. Suppose the scientists who perfected the memory erasure pill used on Sleeping Beauty are now working on a selective memory restoration pill. Every day, your memories of the last 24 hours will be erased. But every two days, the scientists will flip a coin - if heads, they restore your memory from one day ago; if tails, from two days ago. In this way, every two days your memories advance by one day.

This procedure doesn't seem, to me, to change any relevant probability about the situation. On each day, you know that you have a 50% chance of being the one with the causal descendant. So becoming the "oldest" means winning a series of coin flips. But as I noted above, in addition to probability, this problem is asking us about the measure of caring, or hypothetical identity-fluid, and the memory erasure version totally changes our intuitions about how that gets sloshed around.

"This is also interesting if you imagine that you and your other copies don't all have the same memories - instead, each person remembers how many times they've been copied" - I didn't assume that all of the copies have the same memories; just that all copies have a real or false memory of the first copy. Everyone is distinguishable except for the original and the latest copy, but how does it break this analysis? (Actually I realised that I was less clear on this than I could have been, so I edited the post)

For each split, if you are really experiencing the event, you have a 50% chance of being the original and a 50% chance of being the latest clone. However, regardless of how many clones remember the event, there is only a 2/n chance of actually having experienced the split as either the original or the new clone, and so dividing this 50/50 gives a 1/n chance of being either.

Probability according to whom? Who is making a prediction that can be either true or false, and whose state of information leads them to assign this probability?

For example, suppose I get cloned twice, once today and once tomorrow, with no loss of memories or awareness. The only special thing about the cloning device is that it won't copy the tattoo I have on the back of my neck. Right after I get cloned today, I will think there's a 50/50 chance I have the tattoo. Then maybe I'll go home and check in the mirror, and my P(tattoo) will go to close to 0% or 100%. Then tomorrow, the person with the tattoo will go get cloned again - thus learning that they have the tattoo even if they forgot to check in the mirror. Meanwhile, the person with no tattoo sits outside in the waiting room. After this second cloning, the person in the waiting room still knows they don't have the tattoo, while the two people who just got cloned think they have a 50% chance of having the tattoo.

At no point in this story does anyone think they have either a 1/3 or 1/4 chance of having the tattoo.

"After this second cloning, the person in the waiting room still knows they don't have the tattoo, while the two people who just got cloned think they have a 50% chance of having the tattoo" - What's the setup? If it is known that the person with the tattoo is always the one being cloned, the first person in the waiting room can deduce that they don't have the tattoo when the second person walks into it. So it is only the most recent clone who has a 50/50 chance of having the tattoo, unless I'm misunderstanding the problem?

Yes - what you're calling "the original," I'm calling "the person with the tattoo" (I'm choosing to use slightly different language because the notion of originalness does not quite carve reality at the joints). So, as in R Wallace's thought experiment, the person with the tattoo is the person getting cloned over and over.

One might claim that there is a further probability in this situation - the probability you should assign before the whole process starts, that you will in some sense "end up as the person with the tattoo." But I think this is based on a mistake. Rhetorically: If you don't end up as the person with the tattoo, then who does? A faerie changeling? There is clearly a 100% chance that the person with the tattoo is you.

Regarding the bet with charity - the only reason the bet with charity (as I mentioned, in the case where none of you have any memories that indicate which copy you might be) gives a 1/N result rather than a 1/1 result is that all copies of you have to make the bet, and therefore the copies with no tattoo pay out money. If, the day before the cloning, you bet someone that you could show up tomorrow, tattoo and all - well, you could do that 100% of the time.

As I mentioned above, it's perfectly valid to have a degree of caring about future copies, or a degree of personal identification. But the naive view that "you" are like a soul that is passed on randomly in case of cloning is mistaken, and so is the notion of "probability of being the original" that comes from it.

What if both the clone and the original are told which one they are right after cloning? Then the probability of being told twice that you are the original is still 1/4.

Consider this real-life scenario:

100 refugees are met by the king of the host country, who says: only 1 of you will become our citizen, and the other 99 will be slaves. The procedure for selecting the citizen is as follows: we choose 2 of you at random, then a coin is tossed - the loser becomes a slave, and the winner goes on to a second round against another person randomly selected from the remaining 98, and so on. The one who wins the last coin toss becomes a citizen.

In this setting, if you are selected 1st you have close to a 0 chance of becoming a citizen, whereas if you are selected last you have a 50% chance. The game is unfair to those selected first in the same way that it is unfair to the original in the cloning scenario.
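A quick simulation of the king's procedure (my rendering of the scenario above) bears out those numbers:

```python
import random

def citizen(n=100):
    champion = 1  # the refugee drawn first
    for challenger in range(2, n + 1):
        # Fair coin toss: the loser becomes a slave, the winner plays on.
        champion = challenger if random.random() < 0.5 else champion
    return champion

trials = 200_000
results = [citizen() for _ in range(trials)]
print(sum(r == 100 for r in results) / trials)  # ~0.5 for the last refugee drawn
print(sum(r == 1 for r in results) / trials)    # ~0: the first must win 99 tosses (1/2^99)
```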

Again, that doesn't make a difference. Everyone either experienced the first split or has a false memory of it, and the Citizenship or Slavery game ignores this.

What difference do memories make if you have already been told whether you are the clone or the original? I fail to understand this reasoning.

Another similar scenario:

Let's say the laws are such that after cloning, the original and the clone split all the money that the original has. Now the first clone gets 50% of all the wealth, the second clone gets 25%, and so on, while the original is left with next to zero after 100 splits. That is the same unfairness as in the original problem, except that instead of a probability of having all the money, you get the corresponding fraction of the money. There is no way for you to be left with even 1% of your money if you are the one who keeps getting split.
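The arithmetic of that scenario, spelled out (nothing assumed beyond the rules as stated):

```python
# Each split halves whatever the original still holds; the departing
# clone keeps the other half.
wealth = 1.0
for k in range(1, 101):
    wealth /= 2
    if k <= 2:
        print(f"clone {k} gets {wealth}")  # 0.5, then 0.25
print(f"original keeps {wealth}")          # 1/2^100, about 7.9e-31
```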

Probability is a model of anticipated experiences and can be tested by measuring the frequency of the events it predicts, whether in a real or a thought experiment. What experiment would you set up to test your model that the odds of being the original are 1%?

I am interested in this topic in general, but for this specific post I'm merely trying to respond to an argument that assumes there is such a thing as "the probability". Often it is best to address an argument by taking certain assumptions for granted and explaining that even given those assumptions the argument doesn't follow.

Edit: Actually, thinking about this in terms of bets is more useful than I thought as per Charlie Steiner's comment.

Another way to answer this problem is by tracking changes in indexical information (known to the experimenter).

1) If he presses the button every millisecond, all 101 copies will have the same memories and the same indexical information, so the answer is 1/101.

2) If he presses the button every hour, his timestamp changes, as does his indexical information about each pair of copies, so his probability of being the last copy is 1/2^100.

"So his probability to be the last copy is 1/2^100" - This would contradict the answer I've given here. What about my analysis do you disagree with?

My objection is that the copies are different and they know it, so we can't apply something like SSA to the false memory. The first copy remembers pressing the button only once, and the two second-stage copies remember pressing it twice. So each copy knows which class of copies it belongs to.

If I understand you correctly, copies are reasoning only on the basis of memories of the first button press, while ignoring memories of the second press - is that right?

Consider the situation after the second press:

  • Based on remembering the first press, there is a 2/3 chance that this really happened and a 1/3 chance that this is a false memory
  • Based on remembering the second press, there's a 1/3 chance that you are newly created and your memory of the first press is false, and a 1/3 chance that you are the original. If you don't remember it, you know that you must be the first clone

So both memories are used to figure out what happened.
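To make the two bullet points concrete, here is a hedged enumeration of the three agents after the second press (the labels and the uniform self-location prior are my rendering of the post's assumptions):

```python
from fractions import Fraction

agents = {
    "original": {"press1": "experienced", "press2": "experienced"},
    "clone 1":  {"press1": "experienced", "press2": "not remembered"},
    "clone 2":  {"press1": "false memory", "press2": "experienced"},
}
p = Fraction(1, len(agents))  # uniform credence of 1/3 on each agent

real_first = sum(p for a in agents.values() if a["press1"] == "experienced")
print(real_first)      # 2/3: chance your memory of the first press is real
print(1 - real_first)  # 1/3: chance it is a false memory
print(p)               # 1/3: chance of being the original
```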

I think what avturchin is getting at is that when you say “there is a 1/3 chance your memory is false and a 1/3 chance you are the original”, you’re implicitly conditioning only on “being one of the N total clones”, ignoring the extra information “do you remember the last split” which provides a lot of useful information. That is, if each clone fully conditioned on the information available to them, you’d get 0-.5-.5 as subjective probabilities due to your step 2.

If that’s not what you’re going for, it seems like maybe the probability you’re calculating is “probability that, given you’re randomly (uniformly) assigned to be one of the N people, you’re the original”. But then that’s obviously 1/N regardless of memory shenanigans.

If you think this is not what you’re saying, then I’m confused.

Firstly, what does 0-.5-.5 mean?

Secondly, you're right about conditioning on the last split. The original and the last clone each think that they have a 50% chance of being the original, and everyone else knows that they aren't.

Given this, it's tough to make sense of the problem posed in the original post. Maybe the question isn't asking about the probability of the original knowing that they are the original at the end, but about the chance of someone who thinks they might be the original (including those with false memories) turning out to be the original. Of course, it is hard to define exactly what time we are asking about, since some of these memories are false. It seems like we need to define some kind of virtual time for it to even make sense. But once this is surmounted, it should be 1/n.

Again, I should be clear, this is one part of anthropics where my ideas are less developed. I think this post will have to be edited once I have a more comprehensive theory.