Comments

R0k0

If the human race ends soon, there will have been fewer people in total, so assign that hypothesis a correspondingly lower prior. This exactly cancels the update from the doomsday argument.
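A minimal sketch of that cancellation (the hypothesis names "doom_soon"/"doom_late" and all the numbers are invented for illustration): the doomsday likelihood of any particular birth rank scales as 1/N, while weighting the prior by the number of observers, the SIA move, scales as N, so the two factors cancel and the posterior equals the original prior.

```python
# Toy Bayesian update for the doomsday argument (all numbers invented).
# Given N total humans ever, a uniform birth rank has likelihood 1/N;
# weighting hypotheses by their observer count N (the SIA move) cancels it.

RANK = 1e11  # suppose we are roughly the 100-billionth human

hypotheses = {"doom_soon": 2e11, "doom_late": 2e14}  # N: total humans ever
prior = {"doom_soon": 0.5, "doom_late": 0.5}

def posterior(observer_weighted: bool) -> dict:
    weights = {}
    for h, n in hypotheses.items():
        likelihood = 1.0 / n if RANK <= n else 0.0  # uniform over birth ranks
        sia_weight = n if observer_weighted else 1.0
        weights[h] = prior[h] * likelihood * sia_weight
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

print(posterior(observer_weighted=False))  # doomsday shift: doom_soon ~ 0.999
print(posterior(observer_weighted=True))   # SIA restores the 0.5 / 0.5 prior
```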

R0k0

"Essentially the only consistent low-level rebuttal to the doomsday argument is to use the self-indication assumption (SIA)."

What about rejecting the assumption that there will only ever be finitely many humans? In the infinite case the argument doesn't hold: there is no uniform distribution over infinitely many birth ranks, so the doomsday update cannot even be computed.

R0k0

"seems an arbitrary limit."

Your axiology is arbitrary. Everyone has arbitrary preferences, and arbitrary principles that generate those preferences. You are arbitrary; you can either live with that, or self-modify into something much less arbitrary, like a fitness maximizer, and lose your humanity.

R0k0

I think that the answer to this conundrum is to be found in Joshua Greene's dissertation. On page 202 he says:

"The mistake philosophers tend to make is in accepting rationalism proper, the view that our moral intuitions (assumed to be roughly correct) must be ultimately justified by some sort of rational theory that we’ve yet to discover ... a piece of moral theory with justificatory force and not a piece of psychological description concerning patterns in people’s emotional responses."

When Eliezer presents himself with this dilemma, the neural/hormonal processes in his mind that govern reward and decision-making fire "Yes!" on each of a series of decisions that, in aggregate, lose him $0.02 for no gain.

Perhaps this is surprising because he implicitly models his "moral intuition" as sampling true statements from some formal theory of Eliezer morality, which he must then reconstruct axiomatically.

But the neural/hormonal decision-making and reward processes in the mind are just little bits of biology that squirt hormones around and give us happy or sad feelings according to their own perfectly lawful operation. It is just that if you interpret those happy or sad feelings as implying some utility function, you end up deriving a contradiction. The fact that Eliezer knows about ordinal numbers and Knuth notation merely makes it easier for his formidable analytic subsystem to see those implicit contradictions.
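To make the contradiction concrete, here is a hedged toy money pump (the items, the one-cent fee, and the three-step cycle are invented for illustration, not taken from Eliezer's actual dilemma): every individual trade gets a local "Yes!", yet a full loop returns the agent to its starting point strictly poorer, which no single utility function can rationalize.

```python
# Toy money pump against cyclic preferences (invented example).
# The agent strictly prefers B to A, C to B, and A to C, and will pay
# one cent to upgrade to anything it prefers over what it holds.

CYCLE = ["A", "B", "C"]  # each item preferred to the previous; A preferred to C
FEE_CENTS = 1

holding, paid = "A", 0
for _ in range(len(CYCLE)):
    offered = CYCLE[(CYCLE.index(holding) + 1) % len(CYCLE)]
    # Each trade, judged in isolation, gets a "Yes!" from the agent.
    holding, paid = offered, paid + FEE_CENTS

# Back to 'A', three cents poorer: the three local "Yes!" answers jointly
# imply u(A) < u(B) < u(C) < u(A), a contradiction for any utility u.
print(holding, paid)  # -> A 3
```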

What is the solution?

"If you want to refuse the whole garden path, you've got to refuse some particular step along the way."

Then just arbitrarily refuse some step. Your brain encodes your motivations in a way that doesn't make global sense, so you will have to draw a lot of arbitrary lines.