In Sam Altman's recent interview on Bill Gates' podcast, he said AI tools can currently speed up a programmer by about 3x. It's unclear whether he's referring to publicly available tools like Copilot or to internal tools at OpenAI.

  1. Does anyone know whether he's referring to private or public tools? Is that 3x figure based on any public research?
  2. More speculatively, if programmers are sped up by 3x, how much are AI researchers sped up, particularly using whatever internal tools are currently available at OpenAI? The job involves more than just programming, but many non-programming tasks can be sped up as well.
  3. Even more speculatively, how much do you predict GPT-5 will speed up researchers at OpenAI?

Tomás B.

Feb 21, 2024


GPT-5 with a context window that can fit entire code bases is going to be very scary. Particularly if you think, as I do, that agency is going to start to work soon. I really do think at least "weak recursive self improvement" of the form of automating AI research/training loops is on the table relatively soon.