  • Possible, but seems unlikely. Requires two AIs with different alignments, and requires the benevolent AI to respond to that sort of threat. Also falls under the first point.
  • Incredibly unlikely: an AI is not going to restructure itself to pursue the inverse of its utility function as the result of a single bit flip. (The toy sketch below shows what such an inversion would mean mechanically.)
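
To make the sign-inversion point concrete, here is a minimal hypothetical sketch in Python (an illustration of the general mechanism, not a claim about any real system): flipping the sign of the objective turns an optimizer that climbs toward the intended optimum into one that runs directly away from it.

```python
# Hypothetical toy example: gradient ascent on sign * utility(x).
# The intended utility peaks at x = 3; flipping the sign makes the
# optimizer flee that optimum instead of converging to it.

def utility_gradient(x: float) -> float:
    # Gradient of utility(x) = -(x - 3)^2, which peaks at x = 3.
    return -2.0 * (x - 3.0)

def optimize(sign: float, steps: int = 200, lr: float = 0.05) -> float:
    x = 0.0
    for _ in range(steps):
        x += lr * sign * utility_gradient(x)
    return x

print(optimize(+1.0))  # converges to ~3.0, the intended optimum
print(optimize(-1.0))  # diverges away from 3.0: optimizing the inverse
```

Note, though, that this is a sign error in the objective as written, introduced before or during training; a deployed AI reorganizing itself around one flipped bit is a far less plausible failure, which is the point of the bullet above.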

This is a bit broader than cryonics, but let's consider more specific possible causes of extreme torture. Here are the ones that occurred to me:

An AI runs or threatens to run torture simulations as a disincentive. This is purely a manipulation technique, instrumental to whatever goals the AI has, whether benevolent or neutral.

  • The programmers may work specifically to prevent this. However, MIRI's current stance is that it is safer to let the AI design a utility function for itself. I think this is the most likely and least worrisome way torture simulations could happen (small in scope and for the best).

An AI is programmed to be benevolent, but for some reason finds suffering terminally valuable, perhaps by following a logical but unforeseen implication of a human-designed utility function.

  • I think this is a particularly bad scenario, much worse than most AI design failures, because it ends with humans being tortured eternally, or spending 3% of their existence in hell, or whatever, rather than merely being turned into paperclips.

An AI is programmed to be malevolent.

  • This seems extremely unlikely, given the number of people and the resources required to create an AI, and the immense and obvious disutility of such a project.

An AI is programmed to obey someone who is malevolent.

  • Hopefully this will be prevented by organizations like MIRI, along with ethics boards and screening processes.

Aliens run torture simulations of humans as punishment for defecting in an intergalactic acausal agreement.

  • This is the bloody AI's problem, not ours.

A country becomes a dystopia that tortures people.

  • Possible but very unlikely for political and economic reasons.

Thoughts? Please add.