First of all, I'm really glad we're having this conversation.

This question is the one philosophical issue that has been bugging me for several years. I read through your post and your comments and felt like someone was finally asking this question in a way that has a chance of being understood well enough to be resolved!

... then I began reading the replies, and strangely, the inferential distance is so great in some places that even I begin to lose the thread of your original question, though I have the very same question.

Taking a step bac...

My position would be that actions speak louder than thoughts. If you act as though you value your own happiness more than that of others... maybe you really do value your own happiness more than that of others? If you like doing certain things, maybe you value those things - I don't see anything irrational in that.

(It's perfectly normal to deceive ourselves into believing our values are more selfless than they actually are. I wouldn't feel guilty about it; similarly, if your actions are good, it doesn't really matter whether you're doing them for the sake of other ...

TheOtherDave · 7y: Can you say more about why "it's just a fact that I care" is not satisfying? Because from my perspective that's the proper resolution... we value what we value, we don't value what we don't value; what more is there to say?

What makes us think _any_ of our terminal values aren't based on a misunderstanding of reality?

by bokov · 1 min read · 25th Sep 2013 · 89 comments

Let's say Bob's terminal value is to travel back in time and ride a dinosaur.

It is instrumentally rational for Bob to study physics so he can learn how to build a time machine. But as he learns more physics, Bob realizes that his terminal value is not merely impossible but meaningless: by definition, someone riding a dinosaur in Bob's past cannot be a future evolution of the present Bob, so the event he wants can never lie in his own future.

There are a number of ways to create the subjective experience of having gone into the past and ridden a dinosaur. But to Bob, it's not the same because he wanted both the subjective experience and the knowledge that it corresponded to objective fact. Without the latter, he might as well have just watched a movie or played a video game.
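
One way to see why the simulation doesn't help: Bob's utility function is conjunctive, not purely experiential. A minimal sketch, assuming we can model it as a function of two booleans (the function and argument names are my hypothetical illustration, not anything from the post):

```python
def bob_utility(subjective_experience: bool, corresponds_to_fact: bool) -> int:
    """Bob gets 1 util only if he both has the experience and it is veridical."""
    return 1 if (subjective_experience and corresponds_to_fact) else 0

assert bob_utility(True, False) == 0  # a perfect simulation is worthless to Bob
assert bob_utility(True, True) == 1   # genuine time travel is what he wanted
```

If genuine time travel is incoherent, `corresponds_to_fact` can never be `True`, so Bob's maximum achievable utility is zero no matter how good the movie, game, or simulation gets.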

So if we took the original, innocent-of-physics Bob and somehow calculated his coherent extrapolated volition, we would end up with a Bob who has given up on time travel. The original Bob would not want to be this Bob.
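
To make the structure of that move explicit, here is a minimal sketch of extrapolation-as-filtering (a toy model of my own, assuming only that extrapolation discards values whose referents fail a coherence check; it is not an actual CEV algorithm, and every name in it is hypothetical):

```python
def extrapolate(values: set[str], is_coherent) -> set[str]:
    """Keep only the terminal values that survive deconstruction."""
    return {v for v in values if is_coherent(v)}

def physics_check(value: str) -> bool:
    # Stand-in coherence test: anything requiring a change to your own
    # past fails once the physics is understood.
    return "your own past" not in value

bob_values = {"ride a dinosaur in your own past", "learn physics", "eat well"}
extrapolated = extrapolate(bob_values, physics_check)
# extrapolated == {"learn physics", "eat well"}: extrapolated-Bob has dropped
# the first value, which is exactly why original-Bob would refuse the swap.
```

The next paragraph's worry is the degenerate case: if the coherence check rejects every value, `extrapolate` returns the empty set, and there is nothing left to rank outcomes by.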

But, how do we know that _anything_ we value won't similarly dissolve under sufficiently thorough deconstruction? Let's suppose for a minute that all "human values" are dangling units; that everything we want is as possible and makes as much sense as wanting to hear the sound of blue or taste the flavor of a prime number. What is the rational course of action in such a situation?

PS: If your response resembles "keep attempting to XXX anyway", please explain what privileges XXX over any number of alternatives, other than your current preference for it. Are you using some kind of pre-commitment strategy to a subset of your current goals? If so, do you now wish you had used the same strategy to precommit to the goals you had as a toddler?