I agree that uploading is copying-then-death. I think you're basically correct with your thought experiment, but your worries about vagueness are unfounded. The appropriate question is: what counts as death? Consider the following two scenarios:

1. A copy of you is stored on a supercomputer and you're then obliterated in a furnace.
2. A procedure is being performed on your brain, you're awake the entire time, and you remain coherent throughout.

In scenario 1 we have a paradigmatic example of death: obliteration in a furnace. In scenario 2 we have a paradigmatic example of survival: continuous, coherent experience throughout.
Upload shmupload.
Let's remove unnecessary complications and consider the more essential question. You are knocked out. While unconscious, a particle-for-particle copy of you is made with all identical energy levels, momenta, spins, colors, flavors, and any other quantum states associated with any of the particles in your body. The only difference is that all the particles in the new copy are 3 m to the east of the particles in the original. The unconscious copies are placed someplace nice and revived approximately simultaneously.
Pretty obviously, neither copy has any better claim to being you than the other.
It's not the book, it's the story.
Moby Dick is not a single physical manuscript somewhere. If I buy Moby Dick, I'm buying one of the millions of copies of it that have been printed over the years. It's still Moby Dick because Moby Dick is the words, characters, events, etc. of the story, and that is all preserved via copying.
A slight difference with this analogy is that Moby Dick isn't constantly changing as it ages, gaining new memories and whatnot. So imagine that Melville got halfway through his epic and then ran out of space in his notebook; he copies the half-finished draft into a fresh notebook and keeps writing there. The story keeps growing, and it's still Moby Dick, even though the physical pages have changed.
How do you feel about the continuous uploading procedure described in "Staring into the Singularity"?
I don't acknowledge an upload as "me" in any meaningful sense of the term.
What about if your mind is uploaded, then downloaded into a clone of you, previously grown without any brain function? Would you consider the new meat-you as "you"?
I find the whole question less confusing when viewed from the other direction. After the upload, the uploaded you will view the current you as its past. If the upload is nondestructive, the non-uploaded you will also.
I think the expansion and contraction model, as you've described it, would probably also result in my death. The being that includes the computer and myself would be a new being, of which I am now a component. When the meaty component dies, this is my death, even though there is now a being who perceives itself to be a continuation of me. This being is, in many ways, a continuation of me, just not in the way that I care about most.
I'm not completely sure of this, of course, but anywhere I'm not sure whether I'll die or not I prefer to lean heavily towards not dying.
What's the point of uploading if we have an AI with all the skills and knowledge of everyone not information-theoretically dead at the time of its creation?
I have no idea how to argue with the ideas about consciousness/identity/experience/whatever that make uploading seem like it could qualify as avoiding death. It occurs to me, though, that those same ideas sorta make uploading individuals pointless. If strong AI doesn't happen, why not just upload the most useful bits of people's brainstates and work out how to combine them into some collective that is no one person in particular?
The simplest way to understand all this is to look at others as your co-incarnations.
All the paradoxes go away. What remains is a memetic hazard, though.
Copy my mind to a machine non-destructively, and I still identify with meat-me. You could let machine-me run for a day, or a week, or a year, and only then kill off meat-me. I don't like that option and would be confused by someone who did.
This is just bizarre. If the point is to preserve continuity, why on earth would you let the copy run independently and diverge? Of course it won't then represent a continuation of experience from the point at which meat-you was later killed.
The point of the destructive upload is precisely so that you-now can anticipate the upload's experiences as a continuation of your own.
I think we have to give up on the uniqueness of identity in order to remain consistent in these kinds of sci-fi scenarios.
edit: And I guess "identity" has to take a continuous value too - similar to the anthropic principle - being x% certain you are in a particular world is like being x% certain you are a particular person.
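A toy way to make that graded notion concrete (my formalization, not anything stated in the thread): if a duplication event produces $n$ subjectively indistinguishable successors, the uniform anthropic assignment is

$$P(\text{I am successor } i \mid n \text{ indistinguishable successors}) = \frac{1}{n},$$

so "being x% certain you are a particular person" comes out as $x = 100/n$ in the symmetric case, and unequal weights could cover successors that resemble you to different degrees.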
I think that your position on destructive uploads doesn't make sense, and you did a great job of showing why with your thought experiment.
The fact that you can transition yourself over time to the machine, and you still consider it "you", and you can't actually point to the specific line you crossed in order to become a "machine", means that your original state (human brain) and final state (upload) are essentially the same.
I don't like the structure of this argument. If I morph into a coffee table, I can't mark a specific line at which I become a piece of furniture. This doesn't imply that I'm essentially a coffee table. No hard boundary does not imply no transition.
Suppose that rather than copying my brain, I adjoined it to some external computer in a kind of reverse-Ebborian act; electrically connecting my synapses to a big block of computrons that I can consciously perform I/O to. Over the course of life and improved tech, that block expands until, as a percentage, most of my thought processes are going on in the machine-part of me. Eventually my meat brain dies -- but the silicon part of me lives on.
This is very similar to the premise of Greg Egan's short story, "Learning to Be Me".
I find this an immensely valuable insight: continuity, or "haecceity", is the critical element of self that naive uploading scenarios dismiss. Our current rational conception of self as concept-in-brain has no need for continuity, which is counterintuitive.
We know a good deal about the universe, but we do not yet know it in its entirety. If there were an observer outside of physics, we might suspect they care a great deal about continuity, or their laws might. Depending on your priors, and your willingness to accept that current observational techniques...
Speaking of uploading procedures, I think the most brute-force one - simple in concept and hard in implementation - is described in Transhuman by Yuri Nikitin: just replace neurons one by one with nanorobots that have identical functionality, then, once the whole brain has been transformed, increase its working speed.
But what's the difference between a "non-destructive upload" and "making a copy of the upload" or "making a copy of your biological body"?
The intuition behind "Copy my mind to a machine non-destructively, and I still identify with meat-me." is flawed and incoherent IMHO. What if you can't even tell apart "meat you" and the other one, like the other one is put in a robotic body that looks, feels, ... exactly like the flesh body? You fall asleep, you awake, there are two "you", one flesh, the other synthetic, and neither of you can tell which is which.
Why do you have your position on destructive uploads? It could be that when you go to sleep, you die, and a new person who thinks they're you wakes up. The world is inhabited by day-old people who are deluded by their memories into believing they've lived decades-long lives. Everyone will cease to exist as a person the next time they go to sleep.
If you believe that, I can't prove you wrong. But it's not a productive worldview.
In a world where everyone is uploaded or Star Trek transported each day, you could believe that the world is inhabited by day-old people who will cease to exist on their next transport. I couldn't prove you wrong. But it wouldn't be a productive worldview.
Is there really any important difference between your existence now and one in which your physical body was replaced by a particle-for-particle copy every so often? By WHOM or WHAT is that difference experienced?
Yes, it is of importance to the me right here, right now, in the present. Under one interpretation I wake up in the other room. In the other I do not - it is some other doppelgänger which shares my memories but whose experiences I do not get to have.
If I somehow find myself in the room with my clone, it's true that there's no way, short of checking external evidence like security footage or some such, to determine which of us is the original. But that is a statement about my knowledge, not about the world as it exists. The map is not the territory.
If I were to wake up in the other room with the clone nearby, it no longer matters which one of us is the original. He isn't me. He is a separate person who just happens to share all of the same memories and motivations that I have. I want to say that I wouldn't even give this copy of me the time of day, but that would be rhetorical. In some ventures he would be my greatest friend, in others my worst enemy. (Interestingly, I could accurately tell which, right now, by applying decision theory to variants of the prisoner's dilemma; see the sketch below.) But even when I choose to interfere in his affairs, it is not for directly self-serving reasons - I help him for the same reason I'd help a really close friend, and I hurt him for the same reason I'd hinder a competitor.
The truth has real implications for the me that does exist, in the here and now. Do I spend not-insignificant sums of money on life insurance to cover cryonic preservation for me and my family, thereby forgoing other opportunities? Do I consider assisted suicide and cryonic preservation when I am diagnosed with a terminal or debilitating disease of the brain? Do I stipulate revival instead of uploading in my cryonics contract, knowing that it might mean never being revived if the technology cannot be developed before my brain deteriorates too much? Do I continue to spend time debating this philosophical point with other people on the Internet, in the hope that they too choose revival and there is safety in numbers?
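A minimal sketch of the decision-theory point in that parenthetical (my toy illustration, using standard prisoner's-dilemma payoffs; the commenter doesn't spell out a method): against an exact copy running the same deterministic decision procedure, only the diagonal outcomes are reachable, so cooperation wins, while against a causally independent opponent, defection dominates.

```python
# Toy illustration (hypothetical, not from the thread): why decision theory
# gives different answers against an exact copy vs. an independent opponent.

# Standard one-shot prisoner's dilemma payoffs, indexed by (my_move, their_move).
PAYOFFS = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they defect (sucker's payoff)
    ("D", "C"): 5,  # I defect, they cooperate (temptation)
    ("D", "D"): 1,  # mutual defection
}

def best_move_vs_exact_copy() -> str:
    """An exact copy provably makes the same move I do, so only the
    diagonal outcomes (C, C) and (D, D) are reachable."""
    return max("CD", key=lambda move: PAYOFFS[(move, move)])

def best_move_vs_independent(their_move: str) -> str:
    """Against a causally independent opponent, compare payoffs holding
    their move fixed; defection dominates either way."""
    return max("CD", key=lambda move: PAYOFFS[(move, their_move)])

print(best_move_vs_exact_copy())      # C - cooperate with the clone
print(best_move_vs_independent("C"))  # D
print(best_move_vs_independent("D"))  # D
```

On that reading, whether the clone behaves as greatest friend or worst enemy in a given venture depends on whether its payoffs are coupled to yours the way the copy assumption couples them here.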
Under one interpretation I wake up in the other room. In the other I do not - it is some other doppelgänger which shares my memories but whose experiences I do not get to have.
I don't understand how to distinguish "the clone is you" from "the clone is a copy of you". Those seem like identical statements, in that the world where you continue living and the world where the clone replaces you are identical, atom for atom. Do you disagree? Or do you think there can be a distinction between identical worlds? If so, what is it?
I don't acknowledge an upload as "me" in any meaningful sense of the term; if I copied my brain to a computer and then my body was destroyed, I still think of that as death and would try to avoid it.
A thought struck me a few minutes ago that seems like it might get around that, though. Suppose that rather than copying my brain, I adjoined it to some external computer in a kind of reverse-Ebborian act; electrically connecting my synapses to a big block of computrons that I can consciously perform I/O to. Over the course of life and improved tech, that block expands until, as a percentage, most of my thought processes are going on in the machine-part of me. Eventually my meat brain dies -- but the silicon part of me lives on. I think I would probably still consider that "me" in a meaningful sense. Intuitively I feel like I should treat it as the equivalent of minor brain damage.
Obviously, one could shorten the period of dual life arbitrarily, and I can't point to a specific line where expanded-then-contracted-consciousness turns into copying-then-death. The line that immediately comes to mind is "whenever I start to feel like the technological expansion of my mind is no longer an external module, but the main component," but that feels like unjustified punting.
I'm curious what other people think, particularly those that share my position on destructive uploads.
---
Edited to add:
Compare a destructive upload to non-destructive. Copy my mind to a machine non-destructively, and I still identify with meat-me. You could let machine-me run for a day, or a week, or a year, and only then kill off meat-me. I don't like that option and would be confused by someone who did. Destructive uploads feel like the limit of that case, where the time interval approaches zero and I am killed and copied in the same moment. As with the case outlined above, I don't see a crossed line where it stops being death and starts being transition.
An expand-contract with interval zero is effectively a destructive upload. So is a copy-kill with interval zero. So the two appear to be mirror images, with a discontinuity at the limit. Approach destructive uploads from the copy-then-kill side, and it feels clearly like death. Approach them from the expand-then-contract side, and it feels like continuous identity. Yet at the limit between them they turn into the same operation.
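One way to state that mirror-image structure precisely (my notation, not the original poster's): let $D(t)$ be the intuitive verdict on copy-then-kill when the meat body survives for time $t$ after the copy, and $E(t)$ the verdict on expand-then-contract with a dual-life period of $t$. The intuitions above amount to

$$\lim_{t \to 0^+} D(t) = \text{death}, \qquad \lim_{t \to 0^+} E(t) = \text{survival},$$

even though at $t = 0$ the two procedures describe the same physical operation. The puzzle is exactly that discontinuity.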