User Profile

Karma: 2 · Posts: 0 · Comments: 40

Recent Posts


No posts to display.

Recent Comments

Eliezer, why didn't you answer the question I asked at the beginning of the comment section of this post?

I would greatly prefer that there be Babyeaters, or even to be a Babyeater myself, than the black hole scenario, or a paperclipper scenario. This strongly suggests that human morality is not as unified as Eliezer believes it is... like I've said before, he will be horrified by the results of CEV. (read more)

About the comments on compromise: that's why I changed my mind. The functions are so complex that they are bound to be different in the complex portions, but they also have simplifying terms in favor of compromise, so it is possible that everyone's morality will end up the same when this is taken... (read more)

I wonder if Eliezer is planning to say that morality is just an extrapolation of our own desires? If so, then my morality would be an extrapolation of my desires, and your morality would be an extrapolation of yours. This is disturbing, because if our extrapolated desires don't turn out to be EXA... (read more)

For all those who have said that morality makes no difference to them, I have another question: if you had the ring of Gyges (a ring of invisibility), would that make any difference to your behavior?

Some people on this blog have said that they would do something different. Some people on this blog have said that they actually came to that conclusion, and actually did something different. Despite these facts, we have commenters projecting themselves onto other people, saying that NO ONE would... (read more)

Pablo, according to many worlds, even if it is now raining in Oxford, yesterday "it will rain in Oxford tomorrow" and "it will not rain in Oxford tomorrow" were both equally true, or both equally false, or whatever. In any case, according to many worlds, there is no such thing as "what will happe... (read more)

Nick Tarleton, what is your definition of free will? You can't even say the concept is incoherent without a definition. According to my definition, randomness definitely gives free will.

Z.M. Davis, "I am consciously aware that 2 and 2 make 4" is not a different claim from "I am aware that 2 and 2 make 4." One can't make one claim without making the other. In other words, "I am unconsciously aware that 2 and 2 make 4" is a contradiction in terms.

If an AI were unconscious... (read more)

Ben, what do you mean by "measurable"? In the zombie world, Ben Jones posts a comment on this blog, but he never notices what he is posting. In the real world, he knows what he is posting. So the difference is certainly noticeable, even if it isn't measurable. Why isn't "noticeable" enough for th... (read more)