Psy-Kosh, I realize the goal is to have a definition that's non-arbitrary. So it has to correlate with something else. And I don't see what we're trying to match it with, other than our own subjective sense of "a thing that it would be unethical to unintentionally create and destroy." Isn't this the same problem as the abortion debate? When does life begin? Well, what exactly is life in the first place? How do we separate persons from non-persons? Well, what's a person?
I think the problem to be solved lies not in this question, but in how the ethics of the asker are defined in the first place. And I don't just mean Eliezer, because this is clearly a larger-scale question. "How well would different possible boundary functions match the ethical standards of modern American society?" might be a good place to start.
Let me see if I've got this right. So we've got these points in some multi-dimensional space, perhaps dimensions like complexity, physicality, intelligence, similarity to existing humans, etc. And you're asking for a boundary function that defines some of these points as "persons," and some as "not persons." Where's the hard part? I can come up with any function I want. What is it that it's supposed to match that makes finding the right one so difficult?
"...if you would prefer not to become orgasmium, then why should you?"
I'd prefer not to become orgasmium, because I value my consciousness and humanity: my ability to think, decide, and interact. However, it's unclear to me what preference actually is, other than the traversal of pathways we've constructed, whether we're aware of them or not, leading toward pleasure or away from pain. To drastically oversimplify: those values exist in me as a result of beliefs I've constructed, linking the lack of those things to an identity I don't want, which in turn is linked to an emotional state of sadness and loss that I'd like to avoid. There's probably also a link to a certain identity that I do want, which leads to a sense of pride and rightness, and so to a positive emotional state.
Eliezer, you said there was nothing higher to override your preference to increase intelligence gradually. But what about the preferences that led you to that one? What was it about the gradual increase of intelligence, and your beliefs about what that means, that compelled you to prefer it? Isn't that deeper motivation closer to your actual volition? How far down can we chase this? What is the terminal value of fun, if not orgasmium?
Or is "fun" in this context the pursuit specifically of those preferences that we're consciously aware of as goals?
Oops, I meant State B can lead to A or C.
g, I'm not sure how it all works out in terms of ψ, as the mathematics of multi-dimensional configuration spaces is way over my head. What I'm not clear on is: in the absence of t, why do we have to read the function from "left to right"? Reading in the other direction, State C can "lead to" A or B. Don't we need a variable to differentiate between the C that leads to A and the C that leads to B, in order to, as Eliezer put it, "keep things straight"?
"We don't need the t.
The r never repeats itself."
While this seems to be true given the expansion of the universe, is it strictly necessary? What if some value R does repeat, throwing the universe into an endless loop? At some point, the chains of r's leading up to R0 and R1 would differ; wouldn't we need another variable to encode that?
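For what it's worth, here's a toy way to see the worry. This is entirely my own construction, not anything from the post (the labels `R`, `A`, `B` and the pair representation are just illustrative): if "leads to" is recorded as bare pairs of configuration values, a repeated value makes the relation ambiguous, and an extra index variable is exactly what restores a unique history.

```python
# Toy sketch: represent "state x leads to state y" as ordered pairs
# of configuration labels. If no value ever repeats, the pairs alone
# determine a unique chain of states.
unique = {("r0", "r1"), ("r1", "r2")}

# But if some value R occurs twice with different continuations, the
# bare pairs can no longer say which occurrence of R led where:
repeating = {("R", "A"), ("R", "B")}
successors = sorted(y for (x, y) in repeating if x == "R")
print(successors)  # ['A', 'B'] -- ambiguous: two successors for one R

# Tagging each occurrence with an extra index (the "another variable"
# the question asks about) makes the relation unambiguous again:
indexed = {(("R", 0), "A"), (("R", 1), "B")}
print(sorted(y for (x, y) in indexed if x == ("R", 0)))  # ['A']
```

So if the universe never revisits a configuration, the successor relation is enough; the moment a value repeats, something extra has to keep the histories apart.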