The subject of copying people and its effect on personal identity and probability anticipation has been raised and, I think, addressed adequately on Less Wrong.

Still, I'd like to bring up some more thought experiments.

Recently I had a dispute on an IRC channel. I argued that if some hypothetical machine made an exact copy of me, then I would anticipate a 50% probability of jumping into the new body. (I admit that it still feels a little counterintuitive to me, even though this is what I would rationally expect.) My opponents insisted I would be guaranteed to remain the original: after all, they said, the mere fact that the copy was created doesn't affect the original.

However, consider the actual experiences involved. Call the original Maia1 and the copy Maia2: Maia1 would see Maia2 being created in front of her eyes, while Maia2 would see the same scene up to the moment of forking, at which point the field of view in front of her eyes would abruptly change to reflect the new location.

Here, it is obvious from both an inside and outside perspective which version has continuity of experience, and thus from a legal standpoint, I think, it would make sense to regard Maia1 as having the same legal identity as the original, and recognize the need to create new documents and records for Maia2 -- even if there is no physical difference.

Suppose, however, that this information was erased. For example, suppose a robot sedated and copied the original me, then dragged Maia1 and Maia2 to randomly chosen rooms, and erased its own memory. At this point, neither version of me, nor anyone else, would be able to tell which of the two is the original. What would you do here from a legal standpoint? (I suppose if it actually came to this, the two of me would agree to designate one of us as the original arbitrarily, by tossing an ordinary coin...)

And one more point. What is this probability of subjective body-jump actually a probability of? We could set up various Sleeping Beauty-like thought experiments here. Supposing for the sake of argument that I'll live at most a natural human lifespan no matter which year I find myself in, imagine that I make a backup of my current state and ask a machine to restore a copy of me every 200 years. Does this imply that the moment the backup is made -- before I even issue the order, and from an outside perspective, way before any of this copying happens -- I should anticipate subjectively jumping to any given time in the future, with the probability of finding myself as any particular copy, including the original, tending towards zero the longer the copying machine survives?
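For concreteness, here is a toy sketch of that anticipation (my own framing, purely illustrative; the uniform-anticipation assumption is exactly the thing in question, not an established rule):

```python
# Toy sketch: assume anticipation is spread uniformly over every
# instantiation carrying my memories -- the original plus one restored
# copy every 200 years. (The uniformity assumption is mine; this just
# shows its consequence.)
def p_any_given_awakening(n_restores: int) -> float:
    """Uniform anticipation over the original + n_restores copies."""
    return 1.0 / (1 + n_restores)

for years in (200, 2_000, 20_000, 2_000_000):
    n_restores = years // 200  # one restore every 200 years
    print(f"machine survives {years:>9} years: "
          f"P(any one awakening) = {p_any_given_awakening(n_restores):.2e}")
```

Under that assumption, the probability of any particular awakening, the original included, shrinks towards zero as the machine's lifetime grows.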

 


The subject of copying people and its effect on personal identity and probability anticipation has been raised and, I think, addressed adequately on Less Wrong.

It hasn't been addressed adequately. Probability anticipation is a big mystery. Our best idea (UDT) sidesteps the problem entirely. I think it's fair to say that no one on LW really knows how anticipation will behave when people start making copies of themselves - and you won't even be able to learn that by polling the copies! The only way for anyone to learn the correct law would be to run experiments on themselves and observe the frequencies.

The only way for anyone to learn the correct law would be to run experiments on themselves and observe the frequencies.

That's what I was going to propose, after Emile said he doesn't know what the probabilities mean.

If we adopt a frequentist interpretation, then I suppose the probability makes sense. After all, this is the only way we can actually observe the Born probabilities in quantum mechanics: by repeating the experiment and counting frequencies.

But.

Even if I ran the experiment on myself, what would make my subjective report any more valid than that of the other copies? In quantum mechanics, even assuming MWI, the opinions of our copies in the other branches are inaccessible to us. Suppose the copying were iterated 10 times, with every existing version copied at each step. Then the resulting 1024 copies will have every possible subjective experience generated among them. I guess we could say that what the 50% probability really means is that the resulting copies would obey the same binomial distribution you would get by generating every possible sequence of 10 coin flips.
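To make the coin-flip analogy concrete, here is a small script (my own illustration; the encoding 0 = "continued as the original", 1 = "woke up as the copy" is just a labeling convention):

```python
# Each pass through the copier forks every existing version in two:
# 0 = "continued as the original", 1 = "woke up as the copy".
# After 10 forks there are 2**10 = 1024 subjective histories, one per
# binary string -- exactly the sample space of 10 fair coin flips.
from itertools import product
from math import comb

FORKS = 10
histories = list(product((0, 1), repeat=FORKS))
assert len(histories) == 2**FORKS  # 1024 copies

# Tally copies by how many times their history went "copy-side".
counts = {k: 0 for k in range(FORKS + 1)}
for h in histories:
    counts[sum(h)] += 1

# The tallies are the binomial coefficients C(10, k) -- the same
# distribution as the number of heads in 10 coin flips.
for k, n in counts.items():
    assert n == comb(FORKS, k)
    print(f"{k:2d} copy-side forks: {n:4d} of 1024 copies")
```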

That's what I was going to propose, after Emile said she doesn't know what the probabilities mean.

(I'm a he)

Fixed, sorry. Just seemed like a female name to me, no offense meant. :)

Yeah, Emily is much more common in English and even in French.

The only way for anyone to learn the correct law would be to run experiments on themselves and observe the frequencies.

Not yet. You can't understand what's going on just by running experiments; you also have to understand how to interpret the data.

The closest thing I've experienced to a situation where my subjective experience needs to account for anthropic reasoning is when I'm really tired and fall asleep in the afternoon, then wake up at some time that is either early or late.

For example, if someone tells me that it's sometime between 7 and 9 (without specifying AM or PM), then it's plausible that I'm waking up in the PM, or in the AM. If it's PM, that's fine, but if it's AM, I should panic, as I've overslept and will be late for school.

Two notes: in these situations I mostly wake up in the PM. Like, at least 90% of the time. In light of that, I'm pretty much always wrong when I guess that I've overslept.

Here, it is obvious from both an inside and outside perspective which version has continuity of experience, and thus from a legal standpoint, I think, it would make sense to regard Maia1 as having the same legal identity as the original, and recognize the need to create new documents and records for Maia2 -- even if there is no physical difference.

Personally, I would advise forking the legal identity, rather than creating an entirely new one. It needs to be recognised that Maia2 has as much right to all of Maia's holdings as Maia1 does. And it definitely needs to be recognised that Maia2 possesses all the qualifications Maia did.

It needs to be recognised that Maia2 has as much right to all of Maia's holdings as Maia1 does.

That's a reasonable default assumption, but it also means you lose half your possessions every time you create a duplicate. I wouldn't advise duplication before you've written a will declaring what happens to your belongings (mine would be something like "the original gets to keep everything", and "if there is no way to determine which is the original, a copy is randomly chosen and considered to be the original").

The problem with allowing such wills (and especially wills which allow specific reference to "the original") is that there will be people who believe they are guaranteed to continue on solely as the original, and who will thus be willing to create copies who're, well, slightly screwed.

You need a lot of protections for the copies. And laws regarding when copying is legal.

Why would such people have a motive for making copies in the first place?

This doesn't need to be a hypothetical question. There are people who read this site, who subscribe to the thread theory of identity. If any of them are reading this comment: would you want to start copying yourself? If so, why? And what percentage of your assets would you allocate to the copy, and why?

I can see one possible answer: make a backup to start running only if I die. The backup wouldn't be me, but would be able to take my place for the benefit of my family and friends, some of whom may subscribe to the pattern theory of identity. But that wouldn't require the backup to be running concurrently.

Another possible answer: create a person who is not me but shares my goals, increasing the probability of those goals being fulfilled. But to be effective at this, the copy will probably need some resources. Do those of you who subscribe to the thread theory of identity, regard this as a useful thing to do?

If you create the backup at a single point and reuse it a potentially infinite number of times, then you become susceptible to the (for lack of a better term) "diverging Sleeping Beauty" scenario described above: you somehow have a uniform probability of waking up at any point between now and the end of time, even though the probability of each specific point tends towards zero as "the end of time" approaches infinity.

Note that this paradox doesn't arise in classic Sleeping Beauty, even in an indefinitely iterated version, because it's constructed as a limit of finite probability distributions and thus always converges.
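To spell out the contrast (standard measure theory; the notation is mine): for every finite number of restores, uniform anticipation is a perfectly good distribution, but its limit is not one.

```latex
p_N(k) = \frac{1}{N+1}, \quad k = 0, 1, \dots, N
\qquad\Longrightarrow\qquad
\sum_{k=0}^{N} p_N(k) = 1 \ \text{for every finite } N,
\quad\text{yet}\quad
\lim_{N \to \infty} p_N(k) = 0 \ \text{for each fixed } k .
```

A countably additive measure giving equal weight to countably many outcomes must total either 0 or infinity, so there is no uniform distribution over an unbounded sequence of awakenings -- hence the divergence.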

If you create the backup at incrementally increasing biological age, then at some point you'll have to find a way to cheat death, or accept that your lifespan will be finite even with backups.

Possessions are one thing, but what about people around me? It's not like each copy can receive its own set of parents, friends, love interest(s), etc.

That's more of a reason to not create copies in the first place.

Depends on your love interest(s); some might be thrilled to have several yous.

Well, if we've posited the ability to duplicate people, then presumably each copy could receive its own duplicated set. That would just exacerbate the situation, though, not resolve it.

That said, it's not like I stop having my parents when I gain a sibling. If tomorrow there exist two copies of me, then both copies have the same parents, and my parents have a new son. So? What has anyone lost?

The same thing is even more true with friends. My friends have other friends, and that's perfectly fine; why should it be less so if one of those other friends is identical to me?

Admittedly, if one is monogamous -- or, more to the point, if one's love interest(s) is (are) -- then a resource problem does exist.

That said, while I'm fairly committedly monogamous in the real world, I suspect my attitudes on the matter would change in a world where personal identity can be duplicated freely; it would not surprise me if the same were true of my husband.

I agree that Maia2 is as qualified as Maia, but I'm not sure about legal identity. I guess I'm not sure what you mean by "fork".

If the legal identity were the same, and Maia2 were to commit murder immediately after being made, should Maia1 be punished?

If Maia2 were to kill Maia1 immediately after creation, then is it just passing the baton so to speak? Would it just be Maia1 creating Maia2 with the intent of suicide? This could happen in, say, a lightspeed transportation scheme in which the information needed to reconstruct Maia is transmitted far away, to Mars or Europa or somewhere. Maia2 is made at the destination, then Maia1 is destroyed, leading to one subjective Maia in the desired location.

I guess I'm not sure what you mean by "fork".

By fork I mean, up to the point of duplication you have Maia. After the point of duplication you have Maia1 and Maia2, both of whom are Maia, but neither of whom are each other.

If the legal identity were the same, and Maia2 were to commit murder immediately after being made, should Maia1 be punished?

Did Maia, prior to duplication, intend the murder? If so, as both Maia1 and Maia2 are Maia, they're both guilty of planning the murder. Only Maia2 committed it, but Maia1 planned it just the same.

If Maia2 were to kill Maia1 immediately after creation, then is it just passing the baton so to speak? Would it just be Maia1 creating Maia2 with the intent of suicide?

Maia1 didn't exist separately until Maia2 was created. There was Maia, who is the past of both Maia1 and Maia2.

Subjective anticipation is an approximation that isn't guaranteed to give meaningful results for copied minds, just as Newtonian mechanics isn't guaranteed to give meaningful results for electrons and black holes (though the result seems sensible in the simple case of a single persisting copy).

But as for what the law should say, once we have the technology to do this, it seems to me three principles should apply:

  1. You should specify how your assets are going to be divided, before undertaking the procedure.

  2. Copies do not count as extra people for purposes like voting and receiving social welfare. Instead, these things count among the assets that should be allocated before the copy is made.

  3. If such allocation wasn't done, and nobody knows which one is the original, then the copies should be able to agree on what should be done (divide evenly, flip a coin or whatever), since each has the same values and the same information.

I don't see any good reason to insist that, after the procedure, every asset belongs to exactly one body.

By way of analogy, when I married my husband, a number of assets became legally held in common. There simply is no answer to the question of which of us owns which part of the money in our checking account... we own it jointly; we both have the legal right to dispose of all of it as we choose. Obviously, this creates the potential for conflict, but the legal system doesn't get involved in that conflict... it's our problem.

Of course, marriage is a special relationship. But then, so is identical copyhood.

My initial instinct is for the law to stay out of it, other than to provide a mechanism whereby identical copies can contract with one another and enforce the terms of such contracts.

(And, no, I'm not a libertarian in general, but I do think it works OK as an approach towards relationships among equals.)

I am a libertarian in general, and if you want your assets owned jointly between you and your copies, I certainly have no objection to that. Presumably something indivisible like your vote in elections would then be used by consensus between all copies?

The only caveat I would suggest is that whichever way you want to do it, it's best to make the decision and sign the appropriate documentation before you step into the copying machine. As with divorces and inheritance in our own time, the last thing you want is to run into a dispute after the fact.

Requiring consensus seems unnecessary. If we get one vote between us, then we get to vote once; that's all the legal system has to concern itself with. Everything else is our own problem.

The courts have no interest in whether we agreed on a result, or whether one of us is currently chained to the wall in our basement, or whatever. (Well, the courts may have an interest in the latter for other reasons, but not as it applies to voting.)

I agree with your suggestion; I'm just saying this isn't a legal complication, just a bit of good personal advice. (That said, my husband and I didn't sign a prenuptial agreement when we got married, so my agreement with this advice is clearly relatively superficial.)

The analogy breaks down when you consider that a married couple can define property rights in case of a divorce (and create a legally binding contract) before even marrying, and can in fact divorce, whereas the relationship between copies of a person exists for the entire duration of their coexistence.

It seems unfair to preallocate property among otherwise identical copies of a person, but it's certainly physically possible -- and I can't think of any reason why we'd want to prevent copies from negotiating their own ownership of property once their state vectors have diverged. Seems like a reasonable analogy to the economic side of divorce.

This is a pretty science-fictional aside, though.

The legal relationship needn't last forever.

That is, I'd support a pair of copies legally filing for the equivalent of divorce. And just as with divorce, the couple needs to work out a division of assets at that point, perhaps with the "assistance" of a court of law, or a professional arbitrator.

Of course, if they signed a pre-duplication contract, that's fine too, just as with divorcing couples today.

Incidentally, as a matter of law I'd expect a pre-duplication contract to be binding on both parties X and Y if and only if the law recognizes both X and Y as the original. If copy X is not the original, legally speaking, then copy X was not a signatory to that contract and is not bound by it.

What if they can't agree?

For instance, suppose that prior to the procedure I, like many people, believed that I was guaranteed to continue on as the original, so the original, being the one that acquired the assets, should get all of them.

However, the copy, upon realizing they are the copy, quickly revises this opinion and now believes the assets should be shared equally (this does not even have to be self-serving: it is easy to imagine that, since the copy has the same memories as the original, they will have a powerful intuition that they are the original, or at least one of two originals). The original remains unsympathetic to this, since he assumes that's just what a self-interested clone who only wanted to sponge off someone else would say.

That's the same as inheritance when there's no will: should it all go to the eldest son? Be split equally among sons? Should even daughters get a share? How about the widow? Adopted children? Nephews? Siblings? Non-married companions? Lovers? Children born out of wedlock and not formally recognized? It's not rare for the children to disagree on this.

If the parent didn't specify a will, the law usually has a default case, which hasn't been the same from one society to another, or from one epoch to another.

The difference here is that heirs are different people using different reasoning, and they can appeal to various factors depending on their capacity for reasoning and their implicit privileges (age, degree of relation, etc.)

Here we are arguing about a case where the "heirs" are not only not privileged over each other in any intrinsic sense, but they are also copies of the same thought process, with access to the exact same threads of reasoning.

This logic works in the scenario where nobody knows who the original is. It doesn't work as well when there is a known original, since the difference in subjective experience between the two may be enough to break the symmetry in terms of which arguments each finds appealing; in that case the analogy to heirs probably works best. I myself favour a default of equal splitting, since it means we don't have to argue about who is the 'original'.

Given that the copies (we suppose) use the same thought algorithms at the forking point, it may be helpful to imagine yourself as either version in such a scenario, or mentally roleplay a dialogue between them.

There are no mystical laws of nature associated with consciousness. Therefore the copy has no magical effect on the original you. Therefore it should appear the same to the original you whether the copy is there or whether we just say the copy is there.

Which makes clear that the idea of a "body-jump" is wrong. No soul is leaping forth from the original you to go live in some copy.

So what does this probability of waking up as someone else mean? Well, we can think of your memories, including the memory of this probability, as a "mark" on minds. The probability really asks: given that you have the mark, what is the probability that you are the original? Copying you is just creating more minds with the mark, so if the only thing you know is that you have the mark, you have a 1/N chance of being the original.
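A quick simulation of that counting argument (my own toy code; the "original"/"copy" labels are just tags):

```python
# N minds carry the same memories ("the mark"); exactly one is the
# original. If all you know is that you carry the mark, symmetry gives
# every marked mind the same posterior weight: P(original) = 1/N.
import random

def sampled_mind_is_original(n_minds: int) -> bool:
    """Pick one marked mind uniformly; report whether it's the original."""
    minds = ["original"] + ["copy"] * (n_minds - 1)
    return random.choice(minds) == "original"

N, TRIALS = 10, 100_000
hits = sum(sampled_mind_is_original(N) for _ in range(TRIALS))
print(f"P(original | mark) ~ {hits / TRIALS:.3f}  (expected 1/{N} = {1 / N:.3f})")
```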

The real problem seems to be uncertainty over the definition of "you" given multiple minds with the mark. On that point, I'd like to refer you to the official reference.

Of course there's no soul. Nor is there any spooky physical action at a distance whenever a copy is created.

What I mean by a body jump is the entirely subjective experience of the copy. Past the branching point, the original will experience a smooth continuation of the stream of consciousness, while the copy experiences a jarring disconnect if no interruption of consciousness occurred at the time of copying -- which subjectively feels like a body jump even though that's not what actually happened.

Of course, if you're unconscious (sleeping, tranquilized, etc.) when the scan takes place, the waking experience is symmetric.

Actually, now that I think about it, I'm not at all sure I'd experience a jump. That is, it's entirely possible that both of me would experience a smooth continuation of the stream of consciousness.

After all, my current experience of such a smooth continuity is the output of cognitive algorithms biased towards producing such a coherent experience out of gappy, noisy inputs; it seems plausible that those algorithms would just continue to do the same thing over the branching point without ever constructing an experience of transition.

I think it depends on how obvious you make the difference between their two locations.

Say the copy takes place in a room where I/we/he/I/(the pre-split me) can watch the copy be created standing in the same relative position, etc. as pre-split-unambiguous-me. Then it's entirely possible that both of me would experience a smooth continuation of stream of consciousness.

Now let's say the copy is created in a green room on the other side of a pane of one-way glass, while pre-split me stands watching in a red room. Then I would really hope that one of me would continue a smooth stream of consciousness, while the other one would realize that he "jumped" into a green room.

(nods) I'd really hope so too. I'd even expect it, I think. That said, brains do some astonishingly broken things along these lines.

If the copy cannot see the original at the time of copying, would it feel any different from just being teleported to the destination? Teleportation is impossible because it violates special relativity, but we can still, I think, discuss what the subjective experience would be if it weren't.

Well, in mundane cases where awareness of the transition from point A to B is lost -- blackouts, amnesia, failures of attention, etc. -- the experience is sometimes of a sudden translation, and sometimes of nothing in particular (that is, you're just at B now, and the fact that you don't remember getting there isn't particularly called to your attention), and sometimes of not realizing that anything in particular has happened, and sometimes other things.

I imagine you'd get a similar range in these more speculative cases.

50% subjective probability seems right to me, though I'm not sure if that probability actually means anything, beyond the fact that I shouldn't be surprised in either case.

"Subjective probability" is confusing: say the machine makes 1000 copies (sedating you etc. so you don't know which one you are), puts one in a red room and the remaining 999 in blue rooms. What is my subjective probability of awakening in a blue room? Does that change if I know the 999 copies in the blue rooms are exactly identical computer simulations (the same code executing the same way)?

Since I don't know a non-confusing interpretation of "subjective probability", I'd rather stick to discussing probabilities in terms of bets.
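One way to see why the betting framing still leaves a free choice (a sketch of my own; the "pooling" convention below is an assumption, not anything settled in the thread):

```python
# A ticket pays 1 if you are in a blue room. Its break-even price
# depends on how the 999 bit-identical blue simulations are counted:
# as 999 separate bettors, or pooled into a single bettor.
def fair_price_of_blue(n_blue: int, n_red: int, pool_identical: bool) -> float:
    """Break-even price of a bet that pays 1 if you're in a blue room."""
    if pool_identical:
        n_blue = 1  # treat the identical simulations as one bettor
    return n_blue / (n_blue + n_red)

print(fair_price_of_blue(999, 1, pool_identical=False))  # 0.999
print(fair_price_of_blue(999, 1, pool_identical=True))   # 0.5
```

The bets don't settle the question by themselves: the fair price depends on which counting convention the bookie adopts.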

Does that change if I know the 999 copies in the blue rooms are exactly identical computer simulations (the same code executing the same way)?

My first impulse would be to say that it would make no difference. On the macroscopic level, distinct is distinct, and you can't make macroscopic objects (like two different brains or CPUs) exactly identical. Even if the software running is exactly the same, it will still experience minor variations in runtime and such.

On the other hand, this is where it gets iffy: we can do crazy things with software that we can't do with brains. What if the code for Simulated Sleeping Beauty contains parts that split and rejoin threads? The copying thought experiment already violates one of our intuitive assumptions about personal identity: namely, that it's unique. Interactions between different copies also violate our intuition that it's independent of other minds.

This reminds me of the story in Where Physics Meets Experience:

Yu'el nods. "I can even see some of the troubles myself. Suppose you split brains only a short distance apart from each other, so that they could, in principle, be fused back together again? What if there was an Ebborian with a brain thick enough to be split into a million parts, and the parts could then re-unite? Even if it's not biologically possible, we could do it with a computer-based mind, someday. Now, suppose you split me into 500,000 brains who woke up in green rooms, and 3 much thicker brains who woke up in red rooms. I would surely anticipate seeing the green room. But most of me who see the green room will see nearly the same thing - different in tiny details, perhaps, enough to differentiate our experience, but such details are soon forgotten. So now suppose that my 500,000 green selves are reunited into one Ebborian, and my 3 red selves are reunited into one Ebborian. Have I just sent nearly all of my "subjective probability" into the green future self, even though it is now only one of two? With only a little more work, you can see how a temporary expenditure of computing power, or a nicely refined brain-splitter and a dose of anesthesia, would let you have a high subjective probability of winning any lottery. At least any lottery that involved splitting you into pieces."

De'da furrows his eyes. "So have you not just proved your own theory to be nonsense?"

"I'm not sure," says Yu'el. "At this point, I'm not even sure the conclusion is wrong."

Yes, the Ebborians were, among other things, one of the inspirations for this post. I just didn't see these particular thought experiments raised there.


What is this probability of subjective body-jump actually a probability of? We could set up various Sleeping Beauty-like thought experiments here. Supposing for the sake of argument that I'll live at most a natural human lifespan no matter which year I find myself in, imagine that I make a backup of my current state and ask a machine to restore a copy of me every 200 years. Does this imply that the moment the backup is made -- before I even issue the order, and from an outside perspective, way before any of this copying happens -- I should anticipate subjectively jumping to any given time in the future, with the probability of finding myself as any particular copy, including the original, tending towards zero the longer the copying machine survives?

As long as you correctly anticipate that you will give the machine that order, and that it will carry it out as stated, that sounds right to me. I'll bite that bullet without further qualification. :) Same with the original question about the duplication machine.

What is this probability of subjective body-jump actually a probability of?

I'd unpack the question "What is the probability that my consciousness will suddenly jump into the new body?" as "Of the agents that I currently know to have a mind-state identical to mine, what fraction of them will have an experience that feels like suddenly switching bodies?"

Actually, now that I think of it, you can construct diverging scenarios even in classic Sleeping Beauty. I'll write about it later.

And the problem with making that assessment is that, just like in the classic Sleeping Beauty problem, you have to factor in agents located in your personal future. Even if the copying is said to occur "instantly", no process is truly instantaneous: some time will still pass between the snapshot being taken and the copy being constructed.

I admit that it still feels a little counterintuitive to me

Hang on to that feeling, it's your guide to the truth.

In the situation you describe, there is no stream of consciousness which starts where you are now, and which then jumps discontinuously to the stream of consciousness subsequently occurring in the copy's body.

At most, there is a new stream of consciousness, initiated in the newly created copy, which begins with the (illusory) feeling of a jump, because the first moment's experience is being matched against implanted short-term memories.

To believe otherwise is to believe that a single connected stream of subjectivity can supervene on physically discontinuous systems.

Unfortunately, the way people are likely to resolve this problem is to totally deconstruct subjective timeflow and say that in reality, there is no stream of consciousness anywhere ever, there are just disconnected self-moments in the timeless multiverse, et cetera. But in that case all forms of anticipation are illusory.

(My position is that time and change are real, and there is such a thing as continuity of existence in time. Many people here have a problem with that, because of many worlds, and because even single-world physics describes space-time in a static, geometric way. But that's just a limitation of our current conceptual tools, and not a refutation of the phenomenon of time.)

I conclude, after reading this a few times, that I don't really know what you are labeling with the phrases "stream of consciousness," "stream of subjectivity," and "subjective timeflow." I don't know whether you mean to refer to one, two, or three different things, nor how I would recognize that thing (or those things) if I found an instance of it (them) in my oatmeal, or how I could tell if I subsequently lost it (or them).

That said, if it's what I ordinarily understand people to mean by "stream of consciousness", which is roughly speaking a narrative, then I would agree that after duplication there exist more of those things than existed before duplication... for example, if copy X stubs its toe then its narrative includes some analog to "ow!" which copy Y's narrative doesn't include.

So, yes, I'd agree that there's a new stream of consciousness (or perhaps several, depending on how unified the mind being duplicated is) initiated in the newly created copy, though I would say that the corresponding narrative begins significantly earlier than that moment. (I am sympathetic to the claim that the narrative, instantiated in the copy, is fictional prior to that moment. That said, our narratives are sufficiently fictional in the normal case that I'm not sure it makes much difference.)

All of this seems entirely consistent with, for example, a timeless formulation of quantum physics.

It's relevant here that Mitchell believes consciousness is fundamental to quantum mechanics and vice versa.

Mm. Can you unpack the relevance of that?

It explains your confusion: it's not that MP is doing a poor job explaining a point, it's that he believes nonsense.

Ah! I see.