User Profile

39 posts · 255 comments

Recent Posts

Survey: What's the most negative*plausible cryonics-works story that you know? · 2y · 2 min read · 74

Deliberate Grad School · 3y · 3 min read · 155

Willpower Depletion vs Willpower Distraction · 4y · 2 min read · 25

CFAR is looking for a videographer for next Wednesday · 5y · 1 min read · 1

Are coin flips quantum random to my conscious brain-parts? · 5y · 1 min read · 24

[LINK] General-audience documentary on cosmology, anthropics, and superintelligence · 5y · 1 min read · 0

The Relation Projection Fallacy and the purpose of life · 5y · 3 min read · 42

Narrative, self-image, and self-communication · 5y · 5 min read · 51

Credence calibration game FAQ · 5y · 1 min read · 56

Voting is like donating thousands of dollars to charity · 5y · 4 min read · 212

Recent Comments

I've been trying to get MIRI to stop calling this blackmail (extortion for information) and start calling it extortion (because it's the definition of extortion). Can we use this opportunity to just make the switch?

I support this, whole-heartedly :) CFAR has already created a great deal of value without focusing specifically on AI x-risk, and I think it's high time to start trading the breadth of perspective CFAR has gained from being fairly generalist for some more direct impact on saving the world.

"Brier scoring" is not a very natural scoring rule (log scoring is better; Jonah and Eliezer already covered the main reasons, and it's what I used when designing the Credence Game for similar reasons). It also sets off a negative reaction in me when I see someone naming their world-changing strate...(read more)

This is a cryonics-fails story, not a cryonics-works-and-is-bad story.

Seems not much worse than actual-death, given that in this scenario you could still choose to actually-die if you didn't like your post-cryonics life.

Seems not much worse than actual-death, given that in this scenario you (or the person who replaces you) could still choose to actually-die if you didn't like your post-cryonics life.

This is an example where cryonics fails, and so not the kind of example I'm looking for in this thread. Sorry if that wasn't clear from the OP! I'm leaving this comment to hopefully prevent more such examples from distracting potential posters.

Hmm, this seems like it's not a cryonics-works-for-you scenario, and I did mean to exclude this type of example, though maybe not super clearly:

OP: There's a separate question of whether the outcome is positive enough to be worth the money, which I'd rather discuss in a different thread.

(2) A rich sadist finds it somehow legally or logistically easier to lay hands on the brains/minds of cryonics patients than of living people, and runs some virtual torture scenarios on me where I'm not allowed to die for thousands of subjective years or more.