I would argue that such an AGI would be more likely to move against humanity in the short term. Humanity would pose a much more serious threat early on, and a sufficiently powerful AGI would only have to destroy society enough to stop us from bothering it. Once that is done, there isn't really any rush for it to ensure every last human is dead. In fact, a superintelligent AGI might want to keep some of us alive to study us.