I’m a big life extension supporter, but being unable to ever choose to die is a literal hell. As dark as it is, if these scenarios are likely, the rational thing to do seems to be to die before AGI arrives.
Killing all of humanity is bad enough, but how concerned should we be about even worse scenarios?
There is discussion of some possibilities at https://www.reddit.com/r/SufferingRisk/wiki/intro/. I'd like to see more talk about these issues.
Well... the probability of the scenario where Natural General Intelligence does these two things is approximately 100%.
One such scenario is that the world ends up semi-stable and bipolar. There will be two AIs, and one of them will be friendly. This creates an incentive for the other AI to torture people in order to blackmail the first AI. God help us escape this hell.