When I was thinking about the concept of human brain emulation recently, a disturbing idea occurred to me. I have never seen anyone address it, so I suspect it is probably caused by my being deeply confused about either human neurobiology or computer science. I thought I'd ask about it in the hope that someone more informed can explain it, so the idea can stop bothering me:

Imagine that a brain emulation is in the process of being encoded into a storage medium. I don't think it matters whether the copy is being made from an organic brain or from an existing emulation. Presumably it takes some amount of time to finish copying all the information onto the storage medium. If the information is about a person's values or personality, and it is only halfway copied, does that mean that for a brief moment, before the copying process is complete, the partial copy has a very different personality or values from the original? Are the partially copied personality/values a different, simpler set of personality/values?

Presumably the copy is not conscious during the copying process, but I don't think that affects the question. When people are unconscious they still have a personality and values stored in their brain somewhere; they are just not active at the moment.

I find this idea disturbing because it implies that emulating any brain (and possibly copying de novo AI as well) would inevitably result in creating and destroying multiple different personality/value sets that might count as separate people in some way. No one has ever brought this up as an ethical issue about uploads as far as I know (although I have never read "Age of Em" by Robin Hanson), and my background is not tech or neuroscience, so there is probably something I am missing.

Some of my theories about what I might be missing:

  • I am naively viewing personality/values as being stored in the brain in a simple, list-like format. For example, I may be imagining that someone who likes reading and jogging is being copied, and there is a brief period where the copy is someone who only likes reading, because "jogging" has not been copied yet. In reality, personality/values are probably distributed across the brain in some complex way that would not work like that. If they were only partially copied, the emulation would simply crash, or have no values at all (be unmotivated). It would not have simpler or different values.
  • I am imagining brains as having one section for personality and values, and another section for instrumental things like thought and judgement. In reality this is not the case; the two are intermingled enough that a partially copied brain would not be a simpler version of the original. It would just crash/go insane.
  • It might be theoretically possible to copy a brain in such a way that you could create a simpler version of the original, with a simpler personality and values, that would have a separate, distinct identity. But you would have to do it very precisely and intentionally to get it to work. Normally, partial copies would just not work (crash), unless they were so close to being completely copied that they were more like the original with mild brain damage/amnesia rather than like a different person.
  • Brain emulation would not involve literal neurons; it would involve code that says where each virtual neuron is. So having some code missing during copying would not result in the neurons rerouting into new paths that develop into a new personality, the way they might in someone with brain damage. The missing code would just make the emulator program crash (see the toy sketch after this list).
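
Here is a toy sketch of that last intuition, with entirely hypothetical names and a made-up snapshot format (a real emulator would look nothing like this): if the stored state records how many bytes it should contain, a loader can detect a half-finished copy and refuse to run it, rather than ever producing a "simpler person."

```python
# Toy sketch, not a real emulator: a snapshot format that records its own
# expected length, and a loader that refuses to run an incomplete copy.
import struct

HEADER = struct.Struct("<Q")  # 8-byte little-endian payload length

def make_snapshot(neuron_state: bytes) -> bytes:
    """Serialize a (pretend) neuron state with a length header."""
    return HEADER.pack(len(neuron_state)) + neuron_state

def load_snapshot(data: bytes) -> bytes:
    """Return the neuron state, or raise if the copy is incomplete."""
    (expected,) = HEADER.unpack_from(data)
    payload = data[HEADER.size:]
    if len(payload) != expected:
        # A truncated copy is caught here and never handed to a simulator.
        raise ValueError(f"snapshot truncated: {len(payload)}/{expected} bytes")
    return payload

full = make_snapshot(b"likes reading; likes jogging")
half = full[: len(full) // 2]       # copy interrupted partway through
load_snapshot(full)                 # loads fine
try:
    load_snapshot(half)
except ValueError as err:
    print("refused to load:", err)  # the partial copy is never simulated
```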

I'd appreciate it if someone with more knowledge of this issue, or of programming/neuroscience, would be willing to explain where my thinking is going wrong. I am interested in explanations that are conditional on brain emulation working; obviously, if brain emulation doesn't work at all, this issue won't arise. Thank you in advance; it is an issue that I continue to find disturbing.


1 Answer

Suppose that you are slowly walking into a literal physical tunnel. Almost all of your head is in the tunnel. If the part of your head that is not yet in the tunnel were destroyed, you would survive, but your personality would be different because of the brain damage.

Now consider an uploaded mind being copied. The simulation process is paused, the data is copied byte for byte, and then two separate simulation processes start on separate computers.

If you cut the cable halfway through and only look at what is on the second hard drive, then you get a partial, brain-damaged mind. But at no point is that mind actually run. You are saying that if you ignore part of a mind, you see a brain-damaged mind. In the case of an em being copied, that part might be on a different hard drive.
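
To make the sequence concrete, here is a minimal sketch of that copy step, with made-up names and the snapshot reduced to a byte string. The only point it illustrates is that the copy loop moves inert data, and that an interrupted copy is never handed to a simulator.

```python
from typing import Optional

def copy_snapshot(source: bytes, cut_after: Optional[int] = None) -> bytes:
    """Copy the paused mind's bytes one at a time; optionally 'cut the cable'."""
    destination = bytearray()
    for i, byte in enumerate(source):
        if cut_after is not None and i >= cut_after:
            break                                   # transfer interrupted
        destination.append(byte)
    return bytes(destination)

def start_simulation(snapshot: bytes) -> None:
    """Stand-in for booting an em on its own computer."""
    print("simulating", len(snapshot), "bytes")

paused_mind = b"...serialized neuron state..."      # simulation already paused
copy_a = copy_snapshot(paused_mind)                 # complete copy, computer A
copy_b = copy_snapshot(paused_mind)                 # complete copy, computer B
partial = copy_snapshot(paused_mind, cut_after=10)  # cable cut partway through

start_simulation(copy_a)
start_simulation(copy_b)
# `partial` is never passed to start_simulation: it is inert bytes on a disk,
# and looks like a "brain-damaged mind" only if someone chooses to run it.
```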

Of course, there are good moral reasons to make sure that the data cable isn't unplugged and the half-formed mind run.

I would say that I care about the simulation, not the data as such. In other words, you can encrypt the data and decrypt it again all you want. You can duplicate the data and then delete one copy, so long as you don't simulate the copy before deletion. You might disagree with this point of view, but it is a consistent position.
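
On that view, any amount of shuffling the bytes around is morally inert so long as nothing simulates them. A toy illustration (using a stand-in XOR "cipher" rather than real cryptography):

```python
def toy_encrypt(data: bytes, key: int) -> bytes:
    """XOR every byte with a one-byte key (a toy stand-in, not real crypto)."""
    return bytes(b ^ key for b in data)

mind = b"...serialized neuron state..."

scrambled = toy_encrypt(mind, key=0x5A)        # encrypt the data...
restored = toy_encrypt(scrambled, key=0x5A)    # ...and decrypt it again
assert restored == mind

duplicate = bytes(mind)                        # duplicate the data...
del duplicate                                  # ...then delete one copy

# None of the operations above ever handed a snapshot to a simulator,
# so on this view none of them created or destroyed an experience.
```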

Thanks for the reply. It sounds like maybe my mistake was assuming that unsimulated brain data was functionally and morally equivalent to an unconscious brain. From what you are saying it sounds like the data would need to be simulated even to generate unconsciousness.

Donald Hobson
Yes, to get a state equivalent to sleeping, you are still simulating the neurons. You can get mind states that are ambiguous mixes of awake and asleep.
Ghatanathoah
I am having trouble parsing this statement. Does it mean that when simulating a mind you could also simulate ambiguous awake/asleep states, in addition to simulating sleep and wakefulness? Or does it mean that a stored, unsimulated mind is ambiguously neither awake nor asleep?
Donald Hobson
There are states that existing humans sometimes experience, like sleepwalking, microsleeps, etc., that are ambiguous. Whether or not a digital mind is being simulated is a much crisper distinction.
3 comments

I'd think that the brain is more like a hologram: copying a small part would result in a dimmer, less resolved, but still complete image. That said, I also don't see an ethical issue in copying an inactive brain "trait by trait".

That makes a lot of sense, thank you.

Trait by trait doesn't seem like a likely way a copy would be made.

One hemisphere, then the other, almost does though.


I find this idea disturbing because it implies that emulating any brain (and possibly copying de novo AI as well) would inevitably result in creating and destroying multiple different personality/value sets that might count as separate people in some way. No one has ever brought this up as an ethical issue about uploads as far as I know (although I have never read "Age of Em" by Robin Hanson), and my background is not tech or neuroscience, so there is probably something I am missing.

Suppose that, as you were waking up, different parts of the brain 'came online' at different times. In theory, it could be the same thing (with the 'incomplete parts' even running).