AGI in a vulnerable world