Some expert commentary here: https://www.sciencemag.org/news/2020/12/mutant-coronavirus-united-kingdom-sets-alarms-its-importance-remains-unclear
Noteworthy:
For example, moving from a 90% chance to a 95% chance of copying a skill correctly doubles the expected length of any given transmission chain, allowing much faster cultural accumulation. This suggests that there’s a naturally abrupt increase in the usefulness of culture.
This makes sense when there's only one type of thing to teach or imitate. But some things are easier to teach and imitate than others (e.g. catching a fish vs. building a house). And while there may be an abrupt jump in the ability to teach or imitate each particular skill, this argument doesn't show that there will be a jump in the number of skills that can be taught or imitated (which is what matters).
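The arithmetic behind the quoted claim is just the mean of a geometric distribution. A minimal sketch, assuming each copy in the chain succeeds independently with probability p and the chain ends at the first failure:

```python
def expected_chain_length(p):
    """Expected length of a transmission chain where each copy
    succeeds independently with probability p.

    The chain breaks at the first failed copy, so its length is
    geometrically distributed with mean 1 / (1 - p).
    """
    return 1.0 / (1.0 - p)

# Going from 90% to 95% copy fidelity doubles the expected chain length:
print(expected_chain_length(0.90))  # ≈ 10
print(expected_chain_length(0.95))  # ≈ 20
```

More generally, halving the failure rate (1 − p) doubles the expected chain length, which is why small fidelity gains near 100% have such an outsized effect.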
Right, to be clear that's the sort of number I have in mind and wouldn't call far far lower.
the infection fatality rate is far, far lower [now]
Just registering that, based on my reading of people who study the IFR over time, this is a highly contentious claim, especially in the US.
Are these known facts? If not, I think there's a paper in here.
But what if they reach AGI during their speed up?
I agree, but I think it's unlikely OpenAI will be the first to build AGI.
(Except maybe if it turns out AGI isn't economically viable).
OpenAI's work speeds up progress, but in a way that likely makes later progress smoother. If you spend as much compute as possible now, you reduce potential surprises in the future.
Last year it took Google Brain only half a year to make a Transformer 8x larger than GPT-2 (the T5), and they concluded that model size is a key component of progress. So I wouldn't be surprised if they release something with a trillion parameters this year.
I'm not sure if a probability counts as continuous?
If so, what's the probability that this paper would get into Nature (main journal) if submitted? Or even better, how much more likely is it to get into The Lancet Public Health vs Nature? I can give context by PM. https://doi.org/10.1101/2020.05.28.20116129
No. Amodei led the GPT-3 project, so he's clearly not opposed to scaling things. I don't know why they're leaving, but since they're all starting a new thing together, I presume that's the reason.