Concentration Risk Is Probably More Important Than Alignment Risk, And It's Heading to a Doom Scenario
Summary: Power concentration via AI is likely a greater near-term risk than classical misalignment, primarily because (a) it arises under a wider range of AI development scenarios, including ones where alignment succeeds, (b) its timelines are faster and more empirically grounded, and (c) the three mechanisms society would normally use...
Mar 201