Depending on OpenAI's growth, this is more of a soft upper bound on what gets built.
I'm confused: the announcement indicates that the $400B has been committed and is not dependent on OpenAI's growth (though perhaps you're implying that there's no way they actually spend the full $400B unless OpenAI's revenue keeps growing rapidly)?
Also, why would this $400B / 7GW be an upper bound? A recent WSJ article suggests they are planning to surpass that, although details are super light.
Has Terence Tao publicly indicated that he's shifted his main focus to applying AI to math?
Thank you for writing this post - it really improved my understanding of how RL works.
As a naive follow-up: let's say GPT-6 could be trained in 3 months on a 3GW cluster. Could I instead train it in 9 months on a 1GW cluster?
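To spell out the naive arithmetic behind my question (treating total training compute as simply cluster power times wall-clock time, which obviously ignores things like parallelism limits, utilization, and the cost of waiting longer for the model), the two plans come out equal. The cluster sizes and durations are just the hypothetical ones from my question:

```python
# Naive back-of-envelope: treat total training compute as power x time.
# This ignores chip generation, utilization, interconnect, and how much
# parallelism a single training run can actually absorb.

def gw_months(power_gw: float, months: float) -> float:
    """Total 'compute budget' as cluster power (GW) times wall-clock time (months)."""
    return power_gw * months

plan_a = gw_months(power_gw=3.0, months=3.0)  # 3 GW cluster for 3 months
plan_b = gw_months(power_gw=1.0, months=9.0)  # 1 GW cluster for 9 months

print(plan_a, plan_b)  # both 9.0 GW-months under this naive model
```

My real question is whether that naive equivalence actually holds in practice, or whether something breaks it.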