Could reach the same point.
Said Eliezer agent is programmed genetically to value his own genes and those of humanity.
An artificial Eliezer could reach the conclusion that humanity is worth keeping, but is by no means obliged to come to that conclusion. By contrast, genetics determines that at least some of us humans value the continued existence of humanity.
This is a cliche and may be false, but I'll assume it's true:
"Power corrupts and absolute power corrupts absolutely".
I wouldn't want anybody to have absolute power, not even myself. The only possible use of absolute power I would want would be to stop any evil person from getting it.
To my mind, evil = coercion, and therefore any human who seeks any kind of coercion over others is evil.
My version of evil is the least evil I believe.
EDIT: Why did I get voted down for saying "power corrupts" - the corollary of which is that rejecting power is less corrupting - while Eliezer gets voted up for saying exactly the same thing? Someone who voted me down should respond with their reasoning.
Now this is the $64 google-illion question!
I don't agree that the null hypothesis - take the ring and do nothing with it - is evil.
My definition of evil is coercion leading to loss of resources up to and including loss of one's self.
Thus absolute evil is the loss of one's self across all of humanity, which includes humanity's extinction as one use case (but is not limited to extinction, because being converted into zimboes isn't technically extinction).
Nobody can deny that the likes of Gaddafi exist in the human population: those who are interested in being the total boss of others (even though they add no value to the lives of others), to the extent that they are willing to kill to maintain their boss position.
I would define these people as evil, or as having evil intent. Under no circumstances would I want somebody like this to grab the ring of power, and thus I would be compelled to grab it myself.
The conundrum is that I fit the definition of evil myself. Though I don't seek power to coerce as an end in itself, I would like the power to defend myself against involuntary coercion.
So I see a Gaddafi equivalent go to grab the ring and I beat him to it.
What do I do next?
Well, I can't honestly say that I have the right to kill the millions of Gaddafi equivalents, but I think that on average they add a net negative to the utility of humanity.
I'm left, however, with the nagging suspicion that under certain circumstances Gaddafi-type figures might be beneficial to humanity as a whole. Consider: crowdsourcing the majority of political decisions would probably satisfy the average utility function of humanity. It's fair, but not to everybody. We have almost such a system today (even though it's been usurped by corporations). But in times of crisis, such as during war, it's more efficient to have rapid decisions made by a small group of "experts" combined with those willing to make ruthless decisions - so we can't simply kill off the Gaddafis.
What is therefore optimal in my opinion?
I reckon I'd take all the Gaddafis off-planet and put them in simulations, to be recalled only at times of need, leaving sanitized nice-person zimbo copies in their place. Then I would destroy the ring of power and return to my previous life, before I was tempted to torture those who have done me harm in the past.
Xannon decides how much Zaire gets.
Zaire decides how much Yancy gets.
Yancy decides how much Xannon gets.
If any pie is left over, they go through the process again on the remainder, ad infinitum, until an approximation of the whole pie has been eaten.
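The circular scheme above can be sketched as a small simulation. This is only an illustration under assumed rules: the function name, the fixed 50% allocation per decision, and the round count are all my assumptions, since the scheme doesn't specify how much each person awards the next.

```python
def divide_pie(rounds=10, share=0.5):
    """Simulate the circular pie-division protocol.

    Each decider allocates `share` of the remaining pie to the person
    they are responsible for, and the leftover is re-divided in further
    rounds until only a negligible sliver remains.
    """
    # Xannon decides Zaire's share, Zaire decides Yancy's,
    # Yancy decides Xannon's.
    recipients = {"Xannon": "Zaire", "Zaire": "Yancy", "Yancy": "Xannon"}
    totals = {name: 0.0 for name in recipients}
    remaining = 1.0
    for _ in range(rounds):
        for decider in ("Xannon", "Zaire", "Yancy"):
            portion = remaining * share  # slice the decider hands out
            totals[recipients[decider]] += portion
            remaining -= portion
    return totals, remaining
```

With these assumed rules the leftover shrinks by a factor of eight per round, so after a few rounds essentially the whole pie has been handed out, though the three shares converge to unequal amounts (whoever is decided for first does best).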
Very good response. I can't think of anything to disagree with, and I don't think I have anything more to add to the discussion.
My apologies if you read anything adversarial into my message. My intention was to be pointed in my line of questioning but you responded admirably without evading any questions.
Thanks for the discussion.
Thanks for the suggestion. Yes, I have already read it (Steel Beach). It was OK but didn't really touch on our points of contention as such. In fact I'd say it steered clear of them, since there wasn't really the concept of uploads etc. Interestingly, I haven't read anything that closely examines whether the copied upload really is you. Anyways.
"I would also say that it doesn't matter that the vast majority of the cells comprising me twenty years ago are dead,
even though the cells currently comprising me aren't identical to the cells that comprised me then."
OK, I have to say that now I've thought it through, I think "you're not the same as you were yesterday" is a straw-man argument, used as a pretext for saying that a copy is exactly the same as the original from one moment to the next. It misses the point entirely.
Although you are legally the same person, it's true that you're not exactly physically the same person today as you were yesterday and it's also true that you have almost none of the original physical matter or cells in you today as you had when you were a child.
That this is true in no way negates the main point: human physical existence at any one point in time does have continuity. I have some of the same cells I had seven to ten years ago, I have some inert matter in me from the time I was born, AND I have continual memories to a greater or lesser extent. This is directly analogous to the position I posted before about a slow hybridizing transition to machine form, before I had even clearly thought it out consciously.
Building a copy of yourself and then destroying the original has no continuity. It's directly analogous to asexually budding a new copy of yourself and then imprinting it with your memories, and it is patently not the same concept as normal human existence. Not even close.
That you and some others might dismiss the differences is fine, and if you hypothetically wanted to take the position that killing yourself so that a copy of your mind-state can exist indefinitely is acceptable, then I have no problem with that. But it's patently not the same as the process you, I, and everyone else go through on a day-to-day basis. It's a new thing. (Although it's already been tried in nature, in the asexual budding of bacteria.)
I would ask, however, that if that choice is being offered to others, it is clearly explained to them what is happening: physical body death and a copy being resurrected, not that they themselves continue living, because they do not. Whether you consider it irrelevant is beside the point. Volition is very important, but I'll get to that later.
"I agree with you that if a person is perfectly duplicated and the original killed, then the original has been killed. (I would also say that the person was killed, which I think you would agree with.
I would also say that the person survived, which I think you would not agree with.)"
That's directly analogous to the many-worlds interpretation of quantum physics, which has multiple timelines. You could argue from that perspective that death is irrelevant, because in an infinitude of possibilities, if one of your instances dies then you go on existing.
Fine, but it's not me. I'm mortal and always will be even if some virtual copy of me might not be.
So you guessed correctly: unless we're using different definitions of "person" (which is likely, I think), the person did not survive.
"I agree that volition is important for its own sake, but I don't understand what volition has to do with what we've thus far been discussing. If forcing the original to bud kills the original, then it does so whether the original wants to die or not. If it doesn't kill the original, then it doesn't, whether the original wants to die or not.
It might be valuable to respect people's volition, but if so, it's for some reason independent of their survival.
(For example, if they want to die, then respecting their volition is opposed to their survival.)"
Volition has everything to do with it.
While it's true that volition is independent of whether they have died or not (agreed),
the reason it's important is that some people will likely take your position to justify forced
destructive scanning at some point because it's "less wasteful of resources" or some other pretext.
It's also particularly important in the case of an AI over which humanity would have no control.
If the AI decides that uploads via destructive scanning are exactly the same thing as the originals, and it needs the space for its own purposes, then there is nothing to stop it from just going ahead - unless volition is considered to be important.
Here's a question for you: Do you have a problem with involuntary forced destructive scanning in order to upload individuals into some other substrate (or even a copied clone)?
So here's a scenario for you given that you think information is the only important thing:
Do you consider a person who has lost much of their memory to be the same person?
What if such a person (who has lost much of their memory) then has a backed up copy of their memories from six months ago imprinted over top. Did they just die? What if it's someone else's memories: did they just die?
Here's yet another scenario. I wonder if you have thought about this one:
Scan a person destructively (with their permission).
Keep their scan in storage on some static substrate. Then grow a perfectly identical clone of them (using "identical" to mean functionally identical, because we can't get exactly identical, as discussed before). Copy the contents of the mind-state into that clone.
Ask yourself this question: How many deaths have taken place here?
"Yes, I would say that if the daughter cell is identical to the parent cell, then it doesn't matter that the parent cell died at the instant of budding."
OK good to know. I'll have other questions but I need to mull it over.
"I would also say that it doesn't matter that the vast majority of the cells comprising me twenty years ago are dead, even though the cells currently comprising me aren't identical to the cells that comprised me then."
I agree with this but I don't think it supports your line of reasoning. I'll explain why after my meeting this afternoon.
"I agree with you that if a person is perfectly duplicated and the original killed, then the original has been killed. (I would also say that the person was killed, which I think you would agree with. I would also say that the person survived, which I think you would not agree with.)"
Interesting. I have a contrary line of argument which I'll explain this afternoon.
"I agree that volition is important for its own sake, but I don't understand what volition has to do with what we've thus far been discussing. If forcing the original to bud kills the original, then it does so whether the original wants to die or not. If it doesn't kill the original, then it doesn't, whether the original wants to die or not. It might be valuable to respect people's volition, but if so, it's for some reason independent of their survival. (For example, if they want to die, then respecting their volition is opposed to their survival.)"
Disagree. Again I'll explain why later.
"A question for you: if someone wants to stop existing, and they destructively scan themselves, am I violating their wishes if I construct a perfect duplicate from the scan? I assume your answer is "no," since the duplicate isn't them; they stopped existing just as they desired."
Maybe. If you have destructively scanned them then you have killed them, so they no longer exist; in that respect you have complied perfectly with their wishes, from my point of view. But in order to then make a copy, have you asked their permission? Have they signed a contract giving you the right to make copies? Do they even own the right to make copies of themselves?
I don't know.
What I can say is that our differences in opinion here would make a superb science fiction story.
Of course I would do it because it would be better than nothing. My memories would survive. But I would still be dead.
Here's a thought experiment for you to outline the difference (whether or not you think it makes sense from your position of only valuing the information):
Let's say you could slowly transfer a person into an upload by the following method:
You cut out a part of the brain. That part of the brain is now dead. You replace it with a new part, a silicon part (or some computational substrate) that can interface directly with the remaining neurons.
Am I dead? Yes, but not all of me is, and we're now left with a hybrid being. It's not completely me, but I've not yet been killed by the process, and I get to continue to live and think thoughts (even though part of my thinking is now happening inside something that isn't me).
Gradually over a process of time (let's say years rather than days or minutes or seconds) all of the parts of the brain are replaced.
At the end of it I'm still dead, but my memories live on. I did not survive but some part of the hybrid entity I became is alive and I got the chance to be part of that.
Now I know the position you'd take is that speeding that process up is mathematically equivalent.
It isn't from my perspective. I'm dead instantly and I don't get the chance to transition my existence in a meaningful way to me.
Sidetracking a little:
I suspect you were comparing your unknown quantity X to some kind of "soul". I don't believe in souls. I value being alive, having experiences, and being able to think. To me, dying and then being resurrected on the last day by some superbeing who has rebuilt my atoms using other atoms and then copied my information content into some kind of magical "spirit being" is exactly identical to deconstructing me - killing me - and making a copy, even if I took the position that the reconstructed being on "the last day" was me. Which I don't. As soon as I die, that's me gone, regardless of whether some superbeing reconstructs me later using the same or different atoms (if that were possible).
EDIT: Yes, you did understand, though I can't personally say I'm willing to come out and state definitively that the X is a red herring, though it sounds like you are willing to do so.
I think it's an axiomatic difference Dave.
It appears from my side of the table that you're starting from the axiom that the only important thing is information, and that originality and/or physical existence, as distinct from information, mean nothing.
And you're dismissing the quantum states as if they are irrelevant. They may be irrelevant, but since there is some difference between the two copies below the macro scale (the positions are different and the atoms are different, though unidentifiably so, other than saying that the count is 2x rather than x), it's impossible to dismiss the question "Am I dying when I do this?", because you are making a lossy copy even from your standpoint. The only get-out clause is to say "it's a close enough copy, because the quantum states and positions are irrelevant: we can't measure the difference between the atoms in two identical copies on the macro scale, other than saying we've now got 2x the same atoms whereas before we had 1x."
It's exactly analogous to a bacterium budding. The original cell dies and a close-to-exact copy is budded off.
If the daughter cell were an exact copy of the information content of the original, then you'd have to say, from your position, that it's the same bacterium and the original is not dead, right? Or maybe you'd say that it doesn't matter that the original died.
My response to that argument (if it were the line of reasoning you took - is it?) would be that it matters volitionally: if the original didn't want to die and was forced to bud, then it has been killed.
"Again, just to be clear, what I'm trying to understand is what you value that I don't. If data at these high levels of granularity is what you value, then I understand your objection. Is it?"
OK, I've mulled your question over, and I think I've grasped the subtlety of what you're asking, as distinct from the slight variation I answered before.
Since I value my own life, I want to be sure that it's actually me that's alive if you plan to kill me. Because we're basically creating an additional copy really quickly and then disposing of the original, I have a hard time believing that this is equivalent to a single copy walking through a gate.
I don't believe that the information by itself is enough to answer the question "Is it the original me?" in the affirmative. And given that it's not even all of the information (though it is all of the information on the macro scale), I know for a fact we're making a lossy copy. The quantum states are possibly irrelevant on a macro scale for determining whether A == B, but since I know from physics that the copies are not exactly equivalent once you go down to the quantum level, I just can't buy into it - though things would be murkier if the quantum states were provably identical.
Does that answer your question?