“Our futures have never been more correlated,” Nick Cammarata wrote in a tweet. I’m wondering how true that is for people who think we’re in an unstoppable arms race toward world-ending ASI. If such individuals were selfish enough, wouldn’t they want to kill nearly all of humanity from a bunker before takeoff, precisely in order to prevent takeoff? I’m thinking of the window when AI is powerful enough for catastrophic misuse but not yet recursively self-improving.
I'd appreciate it if someone could tell me what I'm missing. I don't mind the downvotes; I'd just like someone who disagrees to shed some light. To be clear, I'm certainly not advocating that anyone take the described actions, but not everyone is ethical.