It's peculiar to see you comment on the fear of "megalomaniacs" gaining access to AGI before anyone else, just prior to the entire spiel on how you were casually made emotionally dependent on a "sociopathic" LLM. This may be a slightly heretical idea, but perhaps the humans you would trust least with such a technology are the ones best equipped, emotionally and cognitively, to handle interactions with a supposed AGI. The point being, in part, that a human evil is better than an inhuman evil.
I'm inclined to think there exists no one who, at once, is both broadly "aligned" to the cause of human happiness as to use it...