Concentration Risk Is Probably More Important Than Alignment Risk, And It's Heading to a Doom Scenario