User Profile


Recent Posts


No posts to display.

Recent Comments

I simplify here because, for a more complex event, a lot of people assume I would have contradictory expectations.

But I think you're being even more picky here. Do I -expect- that increasing the amount of gold in the world will slightly affect the market value? Yes. But I haven't wished anything re...(read more)

The genie is, after all, all-powerful, so there are any number of subtle changes it could make that you didn't specify against that would immediately make you, or someone else, wish for the world to be destroyed. If that's the genie's goal, you have no chance. Heck, if it can choose its form it c...(read more)

A wish is a pretty constrained thing, for some wishes.

If I wish for a pile of gold, my expectations probably constrain lots of externalities like 'Nobody is hurt acquiring the gold, it isn't taken from somewhere else, it is simply generated and deposited at my feet, but not, like, crushing me, or ...(read more)

I just take this as evidence that I -can't- beat the genie, and don't attempt any more wishes.

Whereas if it's something simple, then I have pretty strong evidence that the genie is -trying- to meet my wishes, that it's a benevolent genie.

Wish 1: "I wish for a paper containing the exact wording of a wish that, when spoken to you, would meet all my expectations for a wish granting X." For any value of X.

Wish 2: Profit.

Three wishes is overkill.

That's hardly a critique of the trolley problem. Special relativity itself stipulates that it doesn't apply to faster-than-light movement, but a moral theory can't say "certain unlikely or confusing situations don't count". The whole point of a moral theory is to answer those cases where intuition...(read more)

I can attest that I had those exact reactions on reading those sections of the article. And in general I am more impressed by someone who graduated quickly than by one who took longer than average, and by someone who has written a book rather than one who hasn't. "But what if that's not the case?" is hardl...(read more)

I don't see how this is admirable at all. This is coercion.

If I work for a charitable organization, and my primary goal is to gain status and present an image as a charitable person, then efforts by you to change my mind are adversarial. Human minds are notoriously malleable, so it's likely that...(read more)

It's excessive to claim that the hard work, introspection, and personal -change- (the hardest part) required to align your actions with a given goal are equivalent in difficulty or utility to just taking a pill.

Even if self-help techniques consistently worked, you'd still have to compare the oppor...(read more)

You are quite right. My scores correlate much better now; I retract my confusion.