Eliezer has proposed a puzzle:

So here's a simple algorithm for winning the lottery:

Buy a ticket.  Suspend your computer program just before the lottery drawing - which should of course be a quantum lottery, so that every ticket wins somewhere.  Program your computational environment to, if you win, make a trillion copies of yourself, and wake them up for ten seconds, long enough to experience winning the lottery.  Then suspend the programs, merge them again, and start the result.  If you don't win the lottery, then just wake up automatically.

In response, rwallace has proposed a reductio of subjective probability:

So you decide to create your full allowance of 99 copies, and a customer service representative explains how the procedure works: the first copy is made, and informed he is copy number one; then the second copy is made, and informed he is copy number two, etc. That sounds fine until you start thinking about it, whereupon the native hue of resolution is sicklied o'er with the pale cast of thought. The problem lies in your anticipated subjective experience.

After step one, you have a 50% chance of finding yourself the original; there is nothing controversial about this much. If you are the original, you have a 50% chance of finding yourself still so after step two, and so on. That means after step 99, your subjective probability of still being the original is 0.5^99, in other words as close to zero as makes no difference.

I think they have both missed the simple-minded and "normal" approach to such problems. Let's hypothesize this:

1) If a single universe contains N identical copies of your current information state (possibly existing at different times and in different places), and some event will happen to K of them, then you should assign probability K/N to that event.

2) If there are multiple universes existing with different "measure" (whatever that is), then your prior probability of being in a certain universe is proportional to its "measure", regardless of the number of your copies in that universe, as long as that number of copies is greater than zero. (See the sketch below for how the two rules combine.)
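For concreteness, here is a minimal sketch of how I imagine the two rules combining (Python; the function name, measures, copy counts and event memberships are made-up illustration values, not anything derived from physics):

```python
# Sketch: combine rule 2 (weight universes by measure, ignoring universes
# with zero copies of you) with rule 1 (within a universe, an event that
# happens to K of your N copies gets probability K/N).

def event_probability(universes):
    """Each universe is a dict with 'measure', 'copies' (N) and 'affected' (K)."""
    live = [u for u in universes if u['copies'] > 0]
    total_measure = sum(u['measure'] for u in live)
    return sum(u['measure'] / total_measure * u['affected'] / u['copies']
               for u in live)

# Two equal-measure universes: the event happens to 1 of your 4 copies in
# the first, and to your single copy in the second.
print(event_probability([
    {'measure': 0.5, 'copies': 4, 'affected': 1},
    {'measure': 0.5, 'copies': 1, 'affected': 1},
]))  # 0.5 * 1/4 + 0.5 * 1 = 0.625
```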

In Eliezer's puzzle, my assumptions imply that you cannot win the lottery by using anthropic superpowers, because making many copies of yourself in the winning branch only splits the "observer fluid" in that branch; it doesn't create more of it overall.
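A quick sketch of that claim under the two rules (the ticket odds and copy count are illustrative): the winning branch keeps its fixed measure, and the trillion copies just share it.

```python
# Sketch of the quantum-lottery argument: copying yourself inside the
# winning branch divides that branch's measure among the copies instead
# of adding to it, so the chance of experiencing a win stays at the
# ticket's odds.

def p_experience_winning(p_win, copies_if_win):
    branches = [
        {'measure': p_win,     'copies': copies_if_win, 'winners': copies_if_win},
        {'measure': 1 - p_win, 'copies': 1,             'winners': 0},
    ]
    return sum(b['measure'] * b['winners'] / b['copies'] for b in branches)

print(p_experience_winning(1e-8, 1))       # 1e-08
print(p_experience_winning(1e-8, 10**12))  # still 1e-08
```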

In rwallace's puzzle, my assumptions imply that your probability of still being the original after 99 copyings is 1/100, if you didn't receive any indexical information in the meantime. The reason: spacetime contains 100 copies of you about to be told who they are, all of them informationally equivalent. The physical fact of which copy was made from which is irrelevant; only information matters. For example, if A is copied into B and then B is copied into C without anyone getting indexical info, the second act of copying also pulls some of A's "observer fluid" into C, so they end up with 1/3 each, instead of 1/2, 1/4, 1/4.
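To make the A, B, C bookkeeping explicit, here is a small sketch contrasting the naive split (halve the source's share at each act of copying) with the rule above (all informationally equivalent copies share equally):

```python
# A is copied into B, then B is copied into C, and nobody is told which
# copy they are.

def naive_split():
    # Halve the source's "observer fluid" at each act of copying.
    share = {'A': 1.0}
    for src, dst in [('A', 'B'), ('B', 'C')]:
        share[src] /= 2
        share[dst] = share[src]
    return share

def informational_split():
    # Rule 1: informationally equivalent copies share equally.
    copies = ['A', 'B', 'C']
    return {c: 1 / len(copies) for c in copies}

print(naive_split())          # {'A': 0.5, 'B': 0.25, 'C': 0.25}
print(informational_split())  # 1/3 each
```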

Now the disclaimers:

I know that speaking of things like "reality fluid" is confused and that we know next to nothing. I don't know if my idea carries over to other puzzles. I don't know how well it matches our reality and how it might follow from physics. I don't know what happens when observers get deleted; maybe killing someone without giving them indexical information just redistributes their fluid among surviving identical branches ("merging"), but maybe it just gets lost forever. I don't know what counts as a copy of you. I don't know how to count copies and whether thickness of computers matters. I don't know how to determine if one observer-moment is a continuation of another observer-moment; maybe it's about correct stepping of algorithms, maybe something else. These are all open questions.

(Thanks to Wei Dai and Manfred for discussions)

20 comments

Quantum mechanics makes no mention of "multiple universes" or the measure of a universe.

Intuitively, I can draw a distinction between having two copies of me in the same Everett branch and having one copy in each of two Everett branches (as your proposal would). Physically, though, this is no more valid than speaking of the continuity of the atoms in my brain. I can interpolate smoothly between these two states of the universe. What do we say about the intermediates?

If one situation can be continuously transformed into another, that doesn't imply that they are equivalent. The relevant features (observer-weight, etc.) can also change continuously, though I don't know how they will change in the situation you described.

What continuity is there between having another copy in the same Everett branch and in a different Everett branch?

Most universes you can describe don't break up nicely into Everett branches. My point was that in between having two of you in one branch and having one each in two, there are many less well-divided worlds in which your two copies have complicated relationships (you can go between them smoothly, though it's not informative). It seems like you should also make predictions in these intermediates.

Can't we define an operator counting the number of humans and enumerate its eigenstates, at least in principle?


Could you show an experiment where "two copies in the same branch" and "a copy in each of two branches" behave differently, according to the proposal?

if you didn't receive any indexical information in the meantime.

What if you did receive indexical information? Can you then use the process to make nigh-p-zombies who have an extremely low subjective probability?

I think any theory of "observer fluid" implies that people who narrowly escaped death multiple times are nigh-p-zombies. Maybe that's a simple reductio of the whole idea, I'm not sure.

Not entirely. I have a theory of observer fluid in which people who are too similar only count as close to one person. You can't really take advantage of it, because if you hurt just one of them, they're now a different person.

This is mainly because I dislike the idea that you could be more of a person just by having bigger neurons. It seems equivalent to the idea that two people who act exactly the same count as one.

It still seems pretty bizarre, so I'm hesitant to accept it.

If you make a copy, then inform both the original and the copy of their states ("You're the original", "You're the first copy"), and then proceed to make a new copy of the original, informational equivalence exists only between copy number 2 and the original, making it back to 1/2, 1/4, 1/4.

Yes, I know :-)

the second act of copying also pulls some of A's "observer fluid" into C

I just wrote a comment in response to rwallace's problem which is similar in effect to this, but has a more intuitive justification.

Moving to discussion because it doesn't seem to fare very well as a post on LW proper.

So you never update on information, except by eliminating universes that do not contain any agents with your information state?

What about large universes that produce you only by chance?

I don't completely understand your question. Of course I update on information that concerns my instances in a single universe: in the absence of new copying, the rules reduce to ordinary probability theory, as far as I can see. Ditto if I'm in a small universe that undergoes quantum branching: the act of observation removes me from one of the branches. As for large universes, Boltzmann brains, etc., I don't think such matters are sufficiently settled to serve as decisive counterexamples right now.

Let's say there are two universes, each with 100 copies of you.

In the first, 99% will soon enter information state A and 1% state B. In the second, 1% will enter state A and 99% state B.

You currently estimate 50/50 chances.

Then you find yourself in state A. What chances do you estimate of the first universe?

(Based on my position described in this comment.)

The first universe, as ever, has 50% probability (assuming that the 50% in the problem statement referred to probability, and not just anticipation, or assuming that the two universes are sufficiently similar for an indifference prior to set it at 50%). Anticipation of being in the first universe is probably greater than of being in the second, but it's unclear to what extent, since it's a heuristic measurement that isn't based on any simple rules, and has no normative imperative to be based on simple rules. There seems to be no reason to privilege 99% in particular, unless the copies operate independently and each copy has the same expected impact on overall utility, which accumulates additively, so that the presence of the copies in the first universe introduces 99 times more value than the presence of the copy in the second universe.

Hmm. 99%, I think. Sorry, my brain had a hiccup and I omitted the word "prior" from the 2nd rule for some reason. Now it's in.
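Spelling out the arithmetic under the (now fixed) rules, as a sketch: equal prior measure for the two universes, then K/N within each universe as the chance of finding yourself in state A.

```python
# Bayes update for the two-universe question above.
prior     = {'universe_1': 0.5, 'universe_2': 0.5}           # rule 2: equal measure
p_A_given = {'universe_1': 99 / 100, 'universe_2': 1 / 100}  # rule 1: K/N entering state A

joint = {u: prior[u] * p_A_given[u] for u in prior}
posterior_universe_1 = joint['universe_1'] / sum(joint.values())
print(posterior_universe_1)  # 0.99
```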

There are 99 copies with your information state in one universe.

1 copy with your information state in the other.

To make those equally valuable, you have to divide by the number of copies of you, not by the number of copies with your information state.
