Is there a culture overhang?
Culture is adapted for learnability, transmissibility, and recombinability by humans. To the extent that AI and natural intelligence operate on similar principles, these adaptations should be expected to carry over when culture is used as a training dataset for AI. If so, an AI trained on culture would catch up to the cultural state of the art quickly but have more trouble progressing beyond it: one would expect rapid AI progress up to the frontiers of human culture and nearby low-hanging fruit, after which progress would slow. More speculatively, depending on how compatible humans are with the advancing AI's outputs, we could have an unexpectedly easy time keeping up with the AI's (relatively slower) acceleration. Are we currently living in a culture overhang? If so, how does it affect the picture of AI timelines?
I feel like this "back off and augment" is downstream of an implicit theory of intelligence that is specifically unsuited to dealing with how existing examples of intelligence seem to work. Epistemic status: the idea used to make sense to me and apparently no longer does, in a way that seems related to the ways I've updated my theories of cognition over the past few years.
Very roughly, networking cognitive agents stacks up to cognitive agency at the next level up more easily than expected, and life has evolved to exploit this dynamic from very early on and across scales. It's a gestalt observation and apparently very difficult to articulate as a rational argument. I could...