andrew sauer


If this happened I would devote my life to the cause of starting a global thermonuclear war

Well there are all sorts of horrible things a slightly misaligned AI might do to you.

In general, if such an AI cares about your survival but not your consent to continue surviving, you no longer have any way out of whatever happens next. This is not an out-there idea: many people have values like this, and even more people have values that might become like this if slightly misaligned.

An AI concerned only with your survival may decide to lobotomize you and keep you in a tank forever.

An AI concerned with the idea of punishment may decide to keep you alive so that it can punish you for real or perceived crimes. Given the number of people who support disproportionate retribution for certain types of crimes close to their hearts, and the number of people who have been convinced (mostly by religion) that certain crimes (such as being a nonbeliever, or the wrong kind of believer) deserve eternal punishment, I feel confident in saying that there are some truly horrifying scenarios here from AIs adjacent to human values.

An AI concerned with freedom for some class of people that does not include you (such as the upper class) may decide to keep you alive as a plaything for whatever whims those it cares about have.

I mean, you can also look at the kind of "em society" that Robin Hanson thinks will happen, where everybody is uploaded and stern competition forces everyone to be maximally economically productive all the time. He seems to think it's a good thing, actually.

There are other concerns, like suffering subroutines and the spreading of wild-animal suffering across the cosmos, that are also quite likely in an AI takeoff scenario, and also quite awful, though they won't personally affect any currently living humans.

Well, given that death is one of the least bad options here, that is hardly reassuring...

Fuck, we're all going to die within 10 years aren't we?

Never, ever take anybody seriously who argues as if Nature is some sort of moral guide.

I had thought something similar when reading that book. The part about the "conditioners" is the oldest description of a singleton achieving value lock-in that I'm aware of.

If accepting this level of moral horror is truly required to save the human race, then I for one prefer paperclips. The status quo is unacceptable.

Perhaps we could upload humans and a few cute fluffy species humans care about, then euthanize everything that remains? That doesn't seem to add too much risk?

Just so long as you're okay with us being eaten by giant monsters that didn't do enough research into whether we were sentient.

"I'm okay with that," said Slytherin. "Is everyone else okay with that?" (Internal mental nods.)

I'd bet quite a lot that they're not actually okay with that; they just don't think it will happen to them...

the vigintillionth digit of pi