large amounts of human utility (more than just pleasure) seem harder to achieve than large amounts of human disutility (for which pain is enough).

Carl gave a reason that future creatures, including potentially very human-like minds, might diverge from current humans in a way that makes hedonium much more efficient. If you assigned significant probability to that kind of scenario, it would quickly undermine your million-to-one ratio. Brian's post briefly explains why you shouldn't argue "If there is a 50% chance that x-risks are 2 million times worse…"

S-risks: Why they are the worst existential risks, and how to prevent them

by Kaj_Sotala · 1 min read · 20th Jun 2017 · 107 comments
