Moby Dick is not a single physical manuscript somewhere.

"Moby Dick" can refer either to a specific object, or to a set. Your argument is that people are like a set, and Error's argument is that they are like an object (or a process, possibly; that's my own view). Conflating sets and objects assumes the conclusion.

I'm not conflating them; I'm distinguishing between them. It's because they're already conflated that we're having this problem. I'm explicitly saying that the substrate is not what's important here.

But this works both ways: what is the non-question-begging argument that observer slices can only be regarded as older versions of previous slices when the latter and the former are both running on meat-based substrates? As far as I can see, you have to just presuppose that view to say that an upload's observer slice doesn't count as a legitimate...

Continuity in Uploading

by Error · 1 min read · 17th Jan 2014 · 88 comments


I don't acknowledge an upload as "me" in any meaningful sense of the term; if I copied my brain to a computer and then my body was destroyed, I still think of that as death and would try to avoid it.

A thought struck me a few minutes ago that seems like it might get around that, though. Suppose that rather than copying my brain, I adjoined it to some external computer in a kind of reverse-Ebborian act, electrically connecting my synapses to a big block of computrons to which I can consciously perform I/O. Over the course of life and improving tech, that block expands until, as a percentage, most of my thought processes are going on in the machine part of me. Eventually my meat brain dies -- but the silicon part of me lives on. I think I would probably still consider that "me" in a meaningful sense. Intuitively, I feel like I should treat it as the equivalent of minor brain damage.

Obviously, one could shorten the period of dual life arbitrarily, and I can't point to a specific line where expanded-then-contracted consciousness turns into copying-then-death. The line that immediately comes to mind is "whenever I start to feel like the technological expansion of my mind is no longer an external module, but the main component," but that feels like unjustified punting.

I'm curious what other people think, particularly those who share my position on destructive uploads.

---

Edited to add:

Solipsist asked me for the reasoning behind my position on destructive uploads, which led to this additional train of thought:

Compare a destructive upload to a non-destructive one. Copy my mind to a machine non-destructively, and I still identify with meat-me. You could let machine-me run for a day, or a week, or a year, and only then kill off meat-me. I don't like that option and would be confused by someone who did. Destructive uploads feel like the limit of that case, where the time interval approaches zero and I am killed and copied in the same moment. As with the case outlined above, I don't see a crossed line where it stops being death and starts being transition.

An expand-contract with interval zero is effectively a destructive upload. So is a copy-kill with interval zero. So the two appear to be mirror images, with a discontinuity at the limit. Approach destructive uploads from the copy-then-kill side, and it feels clearly like death. Approach them from the expand-then-contract side, and it feels like continuous identity. Yet at the limit between them they turn into the same operation.
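One way to make that discontinuity explicit is to treat the intuitive verdict as a function of the dual-life interval (this framing and notation are my own, just a sketch of the argument above):

$$
V_{\text{expand}}(t) = \text{``same person''} \quad\text{and}\quad V_{\text{copy}}(t) = \text{``death''} \qquad \text{for all } t > 0,
$$

yet the physical operations converge as the interval shrinks:

$$
\lim_{t \to 0^+} O_{\text{expand}}(t) \;=\; \lim_{t \to 0^+} O_{\text{copy}}(t) \;=\; \text{destructive upload}.
$$

The two verdict functions disagree at every $t > 0$ but are asked to judge the identical operation at $t = 0$, so at least one of them has to jump discontinuously there.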
