Without enough focus on short-termism, or at least collaboration with it, there will be an existential risk. Not from AI, but from humanity. I would not be surprised if shit popped off long before AI even got a chance to kill us.