How likely do you think worse-than-extinction type fates to be?
The risks of unaligned AGI are usually couched in terms of existential risk, whereby the outcome is explicitly or implicitly human extinction. However, given the nature of what an AGI would be able to do, it seems as though there are plausible scenarios worse than human extinction. Just as on an individual...
Mar 24, 2023