Comments

axlrosen · 6mo · 3 · -2

Newbie here.

In the AI Timeline post, one person says it's likely that we will consume 1000x more energy in 8 years than we do today. (And another person says it's plausible.)

How would that happen? I guess the idea is: we discover over the next 3-5 years that plowing compute into AI is hugely beneficial, and so we then race to build hundreds or thousands of nuclear reactors?

axlrosen · 10mo · 1 · 0

Intuitively, I assume that LLMs trained on human data are unlikely to become much smarter than humans, right? That is, without some additional huge breakthrough beyond just being a language model?