All of poiuyt's Comments + Replies

This post convinced me to make a physical backup of a bunch of short stories I've been working on. At first I was going to read through the rest of the comment thread and then go do the backup, but further consideration made me realize how silly that was: burning them to a DVD and writing "Short Story Drafts" on it with a sharpie didn't take more than five minutes and made the odds of me forever losing that part of my personal history tremendously smaller. Go go gadget Taking Ideas Seriously!

I feel that I am being misunderstood: I do not suggest that people sign up for cryonics out of spite. I imagine that almost everyone signed up for cryonics does so because they actually believe it will work. That is as it should be.

I am only pointing out that being told that I am stupid for signing up for cryonics is disheartening. Even if it is not a rational argument against cryonics, the disapproval of others still affects me. I know this because my friends and family make it a point to regularly inform me of the fact that cryonics is "a cult"...

Just asking, were you trying to make that sound awful and smug?

Yep.

While genuine compassion is probably the ideal emotion for a post-cryonic counselor to actually show, it's the anticipation of their currently ridiculed beliefs being validated, with a side order of justified smugness, that gets people going in the here and now. There's nothing wrong with that: "Everyone who said I was stupid is wrong and gets forced to admit it" is probably one of the top ten most common fantasies, and there's nothing wrong with spending your leisure budget on indulging a fantasy. Especially if it has real-world benefits too.

1 [anonymous] 10y
That's... actually kinda sad, and I think I'm going to go feed my brain some warm fuzzies to counter it. Trying to live forever out of spite instead of living well in the here and now that's available? Silly humans.

The other standard argument is that cryonics doesn't need to come out of my world-saving budget; it can come out of my leisure budget. Which is also true, but it requires that I'm interested enough in cryonics that I get enough fuzzy points from buying cryonics to make up whatever I lose in exchange. And it feels like once you take the leisure budget route, you're implicitly admitting that this is about purchasing fuzzies, not utilons, which makes it a little odd to apply to all those elaborate calculations which are often made with a strong tone of moral...
1 Kawoomba 10y
Yes, it is a great psychological coping mechanism. Death is such a deeply personal topic that it would be folly to assume fuzzies, or the avoidance of frighties, didn't factor in. However, such is the case with any measure or intervention explicitly relating to lifespan extension. So while extra guarding against motivated cognition is in order when dealing with one's personal future non-existence and the postponing thereof, saying "you're doing it because of the warm fuzzies!" isn't a sufficient rejection of death escapism. The cryonics buyer may well answer "well, yes, that, and also, you know, the whole 'potential future reanimation' part". You still have to engage with the object level.
7 [anonymous] 10y
Just asking, were you trying to make that sound awful and smug? Because that honestly sounds like a future I don't want to wake up in. I want to wake up in the future where people have genuine compassion for the past, and are happy to welcome the "formerly dead" to a grand new life, hopefully even including their friends and loved ones who also made it successfully to "the Future". If the post-cryonic psychological counsellors of the future woke me up with, "Congratulations, you made the right business decision!", then I would infer that things had gone horribly wrong.

The apparent paradox is resolved as long as you note that P(Daisy thinks Dark does exist|Dark does exist) > P(Daisy thinks Dark doesn't exist|Dark does exist).

That is, even if Dark does exist and does want to hide his existence, his less-than-100%-effective attempts to hide will produce non-zero evidence for his existence and make the probability that Daisy will believe in Dark go up by a non-zero amount.
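To make that concrete, here's a toy Bayes update; every number in it is invented purely for illustration, not taken from the original discussion:

```python
# Toy Bayes update with made-up numbers: even a Dark who hides imperfectly
# leaks some evidence, and observing that leak raises Daisy's credence.

prior_exists = 0.5            # Daisy's prior that Dark exists
p_leak_if_exists = 0.10       # imperfect hiding: evidence still leaks sometimes
p_leak_if_not = 0.01          # base rate of spurious "evidence" if Dark doesn't exist

posterior_exists = (p_leak_if_exists * prior_exists) / (
    p_leak_if_exists * prior_exists + p_leak_if_not * (1 - prior_exists)
)

print(round(posterior_exists, 3))  # 0.909 > 0.5: the leak is non-zero evidence
```

As long as hiding is less than perfectly effective (p_leak_if_exists > p_leak_if_not), observing the leak pushes Daisy's probability up by a non-zero amount.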

1 Decius 10y
If P(Daisy thinks Dark does exist|Dark does exist) > P(Daisy thinks Dark doesn't exist|Dark does exist), then Dark does a very poor job of causing people to believe he doesn't exist; he would do better by not existing!

It seems to me like an AI enclosed in a cloud of chaotic antimatter would not be very useful. Any changes small enough to be screened out by the existence of the antimatter cloud would also be small enough to be destroyed by the antimatter cloud when we go to actually use them, right? If we want the AI to make one paperclip, presumably we want to be able to access that paperclip once it's built. And the antimatter cloud would prevent us from getting at the paperclip. And that's completely ignoring that antimatter bomb rigged to detonate the contents of the box. There needs to be a better way of defining "reduced impact" for this to be a practical idea.
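For contrast with the physical-box approach, here's a toy sketch of one way a "reduced impact" objective could be written down as a penalty term instead; the action names, rewards, impact numbers, and penalty weight are all made up for illustration and are not the proposal from the post:

```python
# Toy "reduced impact" scoring, purely illustrative: task reward minus a
# weighted penalty on how much the action changes the world relative to a
# do-nothing baseline. All values are invented for the example.

candidate_actions = {
    # action: (task_reward, impact relative to doing nothing)
    "do_nothing":            (0.0,  0.0),
    "make_one_paperclip":    (1.0,  0.1),
    "build_paperclip_plant": (5.0, 50.0),
}

LAMBDA = 1.0  # how heavily impact is penalized

def score(action):
    reward, impact = candidate_actions[action]
    return reward - LAMBDA * impact

best = max(candidate_actions, key=score)
print(best)  # "make_one_paperclip": some useful work, but not at any cost
```

The hard part is, of course, filling in the impact column in a principled way; the antimatter cloud tries to enforce that physically, at the cost of also destroying the paperclip we wanted back.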

3 Stuart_Armstrong 11y
Extra clarification: in this example, I'm assuming that we don't observe the AI, and that we are very unlikely to detect the paperclip. How to get useful work out of the AI is the next challenge, if this model holds up.

Pretty sure you've got some adware. Especially if the links are green and in a funny font.

0 MaoShan 11y
Refer to the nested comment above for the details. So nobody else here has links on those words?

I think there are about two good answers here: "Don't make intelligences that just want to make paperclips, or they will work towards creating paperclips in a way that humans would think is unreasonable. In order to have your intelligence act reasonably, it needs to have a notion of reasonableness that mirrors that of humanity. And that means having a utility function that matches that of humanity in general." or "Be sure that your AI has a boredom function so that it won't keep doing the same things over and over again. After a sufficient degr...
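As a toy illustration of the boredom-function idea: one simple way to model it is utility that decays each time the same action is repeated, so the agent eventually prefers to do something else. Everything below, including the decay rate and the alternative action, is invented for the example:

```python
# Toy boredom function, purely illustrative: each repetition of an action
# halves its marginal utility, so the agent stops doing the same thing forever.

from collections import Counter

BASE_UTILITY = {"make_paperclip": 1.0, "write_poetry": 0.6}
BOREDOM_DECAY = 0.5  # each repetition halves the marginal utility

def bored_utility(action, history):
    repeats = Counter(history)[action]
    return BASE_UTILITY[action] * (BOREDOM_DECAY ** repeats)

history = []
for _ in range(5):
    choice = max(BASE_UTILITY, key=lambda a: bored_utility(a, history))
    history.append(choice)

print(history)  # the agent alternates rather than making paperclips forever
```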