This post, brought to you by Beren, argues that a lot of claims about within-paradigm algorithmic progress are actually mostly about just getting better data, leading to a Flynn effect. The reason I'm mentioning this is that once we have to actually build new fabs and...
This linkpost is in part a response to @Raemon's comment about why the procedure Raemon used doesn't work in practice to deal with the selection effects I talked about in my last post. In that post, I described a selection effect where believers in an argument...
Following in the tradition of @Algon, who linkposted an important thread from Daniel Eth about how AI companies are starting to seriously lobby, and have had early successes, I'll linkpost another thread from Daniel Eth, this time about how exponential increase is the default form of growth, assuming something's increasing...
Inspired by this thread, where there's been a whole lot of discussion around what the term AGI actually means, I'm starting to wonder if the term is at this point used far too broadly, with people not distinguishing similar outcomes precisely enough now that we've made progress in AI. Now, @Thane...
There's an interestingly pernicious version of a selection effect that occurs in epistemology, where people can be led into false claims because, when non-believers try to engage with an argument, the unconvinced drop out at random steps, and past a few steps or so, the believers/evangelists who believe in all...
This post from Gwern tackles a question that I suspect could become very relevant for AI automating AI research (and jobs more generally): why current AIs don't semi-reliably produce frontier-expanding insights beyond their training data, and what might be necessary for AI to create insights at least semi-reliably. My...
Jack Clark has a very important post on why it's so difficult to communicate with policymakers about AI risk, and the reason is that AI risk (and most discussion of AGI/ASI) is basically eschatological, in that it involves the end of the world, and/or technology that looks like magic, being developed by...