Rank the following based on likelihood to nullify AI-risk
Rank the following based on the likelihood to nullify AI risk (whether by achieving alignment, stopping AI development, or another way). If you think you have better solutions to AI risk than I came up with, please add them to your ranking. [1] * Give EY[2] $10M * Give EY...
Sep 30, 2022