This is a special post for quick takes by scott loop. Only they can create top-level comments.


Which is more likely to result in a catastrophic outcome: a world war to reach AGI or AGI itself?

I believe a catastrophic loss of human life is much more likely to result from a world war in which factions race to ensure their side reaches AGI first than from AGI itself causing mass loss of human life. I think it's safe to say the worst-case AGI outcome would be far worse than such a potential WWIII (no humans left versus some/few surviving), but there seems to be very little discussion about preventing the war-to-reach-AGI scenario compared to AGI safety. LW probably wouldn't be the place for those discussions (maybe foreign policy think tanks; I'm not sure), but I was curious what other users here felt carried the higher odds of an existential threat.

I've had this notion for a while, but what spurred me to post was the US government blocking Nvidia from providing AI chips that could be used by the Chinese government, and what that signals about US willingness to militarily defend Taiwan.