The flip side of this idea is "cosmic rescue missions" (a term coined by David Pearce), which refers to the hypothetical scenario in which human civilization helps to reduce the suffering of sentient extraterrestrials (in the original context, it referred to the use of technology to abolish suffering). Of course, this is more relevant for simple, animal-like aliens and less so for advanced civilizations, which would presumably have either already implemented similar technology or decided to reject it. Brian Tomasik argues that cosmic r…

Thanks for the links. My thought was that we may assign higher negative utility to those x-risks that could become s-risks as well, namely the LHC and AI.

If you know the Russian science fiction of the Strugatsky brothers, there is an idea in it of "Progressors" - people who are embedded in other civilizations to help them develop more quickly. In the end, the main character concludes that such actions violate the right of any civilization to determine its own path, and he returns to Earth to search for and stop possible alien Progressors here.

S-risks: Why they are the worst existential risks, and how to prevent them

by Kaj_Sotala · 1 min read · 20th Jun 2017 · 107 comments