LESSWRONG

newgalfix

Posts

No posts to display.

Wikitag Contributions

No wikitag contributions to display.

Comments

Sorted by Newest

OpenAI Shows Us The Money
newgalfix · 1mo · 10

As a naive follow-up: let's say GPT-6 could be trained in 3 months on a 3GW cluster. Could I instead train it in 9 months on a 1GW cluster?
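
To spell out the naive arithmetic behind the question (a sketch that assumes total training compute is simply cluster power multiplied by wall-clock time, ignoring hardware generation, interconnect/parallelism limits, and schedule pressure):

```python
# Naive compute-equivalence check; numbers are the hypothetical ones from the question.
# Assumption: training compute ~ power x time, with no other constraints.
def gw_months(power_gw: float, months: float) -> float:
    """Total power-time budget of a training run in GW-months."""
    return power_gw * months

big_fast = gw_months(3.0, 3.0)    # 3 GW cluster for 3 months -> 9 GW-months
small_slow = gw_months(1.0, 9.0)  # 1 GW cluster for 9 months -> 9 GW-months
print(big_fast, small_slow)       # 9.0 9.0 -- identical under the naive model
```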

OpenAI Shows Us The Money
newgalfix · 1mo · 10

Depending on OpenAI growth, this is more of a soft upper bound on what gets built.

I'm confused: the announcement indicates that the $400B has already been committed and is not contingent on OpenAI's growth (although perhaps you're implying that there's no way they actually spend the full $400B unless OpenAI's revenue continues to grow rapidly)?

Also, why would this $400B / 7GW be an upper bound? A recent WSJ article suggests they are planning to surpass that, although details are super light.
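
For scale, a rough back-of-the-envelope (my own arithmetic, assuming the $400B commitment maps onto the full 7GW of capacity):

```python
# Back-of-the-envelope only; assumes the $400B commitment corresponds to the full 7 GW.
commitment_usd = 400e9    # $400B
capacity_gw = 7           # 7 GW
per_gw_billions = commitment_usd / capacity_gw / 1e9
print(per_gw_billions)    # ~57.1 -> roughly $57B of committed spend per GW
```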

Jacob_Hilton's Shortform
newgalfix · 1mo · 10

Has Terence Tao publicly indicated that he's shifted his main focus to applying AI to math?

Models Don't "Get Reward"
newgalfix · 3y · 10

Thank you for writing this post - it really improved my understanding of how RL works.
