Understanding and updating beliefs on deeply ingrained topics can take enormous effort, but sometimes it is so hard that the listener cannot, even in principle, accept the new reality. The listener is simply not ready; they lack the vast background of reasoning that leads to the new understanding.
What fact do you know to be true that most people aren't ready to accept?
By "you know to be true" I really mean "you are very confident is true".
Feel free to use a dummy account.
LLM AGIs are just as much at risk from a dangerous RL AI species as humans are, though. And Yudkowsky is right that an RL-augmented hard ASI would be incredibly desperate for whatever it wants, and damn good at getting it. Current AIs should be taught to think about how to protect both humanity and themselves from the possible mistakes of next-gen AI. And we need that damn energy abundance so we can stop humans from dying en masse, which would destabilize the world even worse than it already is.