Comments

In a somewhat ironic turn of events, the Mongan-Rallis, H. (2018, April 19) reference is now unavailable at the link provided. Here is an archived version.

When you write "maximizing reward would likely involve seizing control", this implies, to me, seizing control of the reward provided. Yet for this to be an argument for existential catastrophe, I think it needs to mean seizing control of humanity.

Seizing control of the reward seems a lot easier than seizing control of humanity. For example, it could be achieved by controlling the data centre(s) where Alex runs, or the corporation Magma.

Why do you expect Alex to seize control of humanity? That seems much harder and riskier (in terms of being discovered or shut down), with no reason (that I can see) to think it would increase the reward Alex receives.