Would a scope-insensitive AGI be less likely to incapacitate humanity? — LessWrong