Why I don't think the probability that AGI kills everyone is roughly 1 (but rather around 0.995).
Let A = the ability to refuse to learn a certain thing, B = not wanting to be replaced by the next step in evolution, and D = the ability to build technology, manipulate others, etc., in a way that kills all humans. For example, humans seem to have A to some...
May 30, 2023