In The I-Less Eye, R Wallace considers a situation where an entity clones itself a hundred times, which leads to a surprising paradox. I'll argue that there is a rather simple flaw in the argument made in the linked article, but first I'll summarise the riddle.
Suppose you are the original. After you've cloned yourself, you have a 50% chance of being the original and a 50% chance of being the clone. This should apply regardless of what clones already exist, so after cloning yourself 100 times, you should have a 1/2^100 chance of being the original.
On the other hand, this seems strange. Intuitively, you ought to have a 1/101 chance of being the original, as there are 101 copies. Further, why should cloning one at a time give a different answer from creating all 100 clones at once?
In order to solve this riddle, we only need to figure out what happens when you've been cloned twice: whether the answer should be 1/3 or 1/4. The first step is correct: the subjective probability of being the original should be 1/2 after you've pressed the cloning button once. However, after we've pressed the cloning button twice, in addition to the two agents who underwent that first split, we now have an agent who falsely remembers undergoing it.
Distributing the probability evenly between the agents who either had that experience or merely remember it, we get a 1/3 chance of the memory being false and a 2/3 chance of it being real. If it is a real memory, then half of that probability (that is, 1/3) belongs to the original and the other half (also 1/3) to the first clone.
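The arithmetic of the two-split case above can be sketched in a few lines. This is purely an illustration of the reasoning; the agent labels are mine, not the article's:

```python
# Two-split case: three agents remember the first split.
# The original and the first clone truly underwent it; the second
# clone only falsely remembers it.
p_false_memory = 1 / 3            # second clone's share
p_real_memory = 2 / 3             # combined share of original + first clone
p_original = p_real_memory / 2    # half of the real-memory mass
p_first_clone = p_real_memory / 2

# The answer at the second step is 1/3, not the 1/4 that
# naive halving (1/2 * 1/2) would give.
assert abs(p_original - 1 / 3) < 1e-12
```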
So, the answer at the second step should be 1/3 rather than 1/4, and continued application provides the answer for 100 copies. I'll admit that I've sketched the reasoning for the 1/n solution rather than providing a formal proof. However, I would suggest that I've sufficiently demonstrated that halving at each step is mistaken, as it assumes that each remembered split is real.
However, we can verify that the 1/n solution produces sensible results. Exactly two agents experience each split (though more agents remember it). So there is a 2/n chance of your having experienced the process and a 1/n chance of having experienced it as the original. Hence there is a 50% chance of "waking up" as the original after a split, given that you actually underwent it rather than merely falsely remembering it.
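The sanity check can be spelled out with exact fractions. Here n = 101 stands for the total number of copies after 100 clonings, under the assumption of a uniform 1/n credence over copies:

```python
from fractions import Fraction

n = 101  # total copies after 100 clonings

# For any given split, exactly two agents truly experienced it:
# the copy that pressed the button and the clone it produced.
p_experienced = Fraction(2, n)
p_as_original = Fraction(1, n)

# Conditional on having actually undergone the split (rather than
# merely remembering it), the chance of being the original is 1/2.
assert p_as_original / p_experienced == Fraction(1, 2)
```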
Please note that I'm not taking a position in this post on whether subjective probabilities are ultimately coherent; I'm only arguing that this particular argument is fallacious.
Finally, I'll note a few questions that this opens up. If we have to include in anthropic reasoning all agents who remember a situation, not just those who experienced it, what actually counts as remembering a situation? After all, in the real world, memories are always imperfect. Secondly, what if an agent has a memory but never accesses it? Does that still count?
EDIT: As you can see from my response to the comments, this post has some issues. Hopefully, I am able to update it at some point, but this isn't an immediate priority.