Thanks for the links. My thought was that we might assign higher negative utility to those x-risks which could also become s-risks, namely the LHC and AI.

If you know the Russian science fiction of the Strugatsky brothers, it contains the idea of "Progressors": people implanted into other civilisations to help them develop more quickly. In the end, the main character concluded that such actions violate any civilization's right to determine its own way, and he returned to Earth to search for and stop possible alien Progressors here.

Oh, in those cases the considerations I mentioned don't apply. But I still thought they were worth mentioning.

In Star Trek, the Federation has a "Prime Directive" against interfering with the development of alien civilizations.

Lumifer: Iain Banks has similar themes in his books -- e.g. Inversions. And generally speaking, in the Culture universe, the Special Circumstances are a meddlesome bunch.

S-risks: Why they are the worst existential risks, and how to prevent them

by Kaj_Sotala, 20th Jun 2017