Comments

Agreed, this seems to be the path that OpenAI is taking with plugins.

When you say "take over", what specifically do you mean? In the context of a GPT descendant, would "take over" imply that it's doing something beyond providing a text output for a given input? For instance, that it's going out of its way to minimize its cross-entropy loss by acquiring additional GPUs?
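(For concreteness, the objective I have in mind is the standard autoregressive next-token cross-entropy loss; the notation below is my own sketch, not anything from the original post:)

$$\mathcal{L}(\theta) = -\sum_{t} \log p_\theta(x_t \mid x_{<t})$$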

"AI developers should not develop and deploy systems with a significant risk of killing everyone."

If you were looking at GPT-4, what criteria would you use to evaluate whether it had a significant risk of killing everyone?