and suffering-focused EAs do less stuff that tends to lead to the destruction of the world.

In support of this, my system 1 reports that if it saw more intelligent people taking S-risks seriously, it would be less likely to nuke the planet given the chance. (I'm not sure I endorse nuking the planet; I'm just reporting an emotional reaction.)

S-risks: Why they are the worst existential risks, and how to prevent them

by Kaj_Sotala · 1 min read · 20th Jun 2017 · 107 comments