I'm interested in estimating how many 'OOMs of compute' span the human range. There are a lot of embedded assumptions there, but let's go with them for the sake of a thought experiment.
Cortical neuron counts in humans have a standard deviation of 10-17%, depending on which source you use. Neuron counts are a useful concrete anchor that I can relate to AI models.
There are many other factors that account for intelligence variation among humans. I'd like to construct a toy model where those other factors are backed out. Put another way: if intelligence variation were entirely explained by neuron count differences, how much larger would the standard deviation of the...
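To make the "backing out" concrete, here is a minimal Python sketch of one way it could be formalized. The neuron-count/intelligence correlation is a number I'm assuming purely for illustration (roughly in the ballpark of reported brain-volume correlations); only the 10-17% SD figure comes from the sources above.

```python
# One way to formalize the back-out (my assumptions, not established numbers):
# under a linear model g = beta * N + noise with corr(g, N) = r, we have
# beta = r * sd(g) / sd(N). If neuron count N alone had to produce the full
# observed spread in g via the same per-neuron effect beta, its SD would need
# to satisfy beta * sd_N_new = sd(g), i.e. sd_N_new = sd(N) / r.

observed_neuron_sd = [0.10, 0.17]  # 10-17% of the mean, per the sources above
assumed_correlation = 0.35         # hypothetical neuron-count/intelligence correlation

for sd in observed_neuron_sd:
    backed_out_sd = sd / assumed_correlation
    print(f"observed SD {sd:.0%} -> backed-out SD {backed_out_sd:.0%}")

# observed SD 10% -> backed-out SD 29%
# observed SD 17% -> backed-out SD 49%
```

The answer just scales as 1/r, so everything hinges on what correlation you think is defensible.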
Thanks for the feedback; exactly the kind of thing I was hoping to get from posting here.
I have thought about a multiplicative model for intelligence, but wouldn't the fact that we see pretty-close-to-Gaussian results on intelligence tests tend to disconfirm it? Any residual non-Gaussianity seems like it can be explained by population stratification, etc., rather than by a fundamentally non-Gaussian underlying structure. Also, as you say, the polygenic evidence seems to point to a linear additive model being essentially correct.
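Here's the kind of quick check I have in mind (a toy simulation with made-up parameters, not a real genetic model): summing a few hundred small independent factors gives something very close to Gaussian, while multiplying them gives a log-normal-ish, visibly right-skewed distribution, which is not what the test-score data look like.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_factors = 20_000, 400  # arbitrary sizes, just for illustration

# Independent standard-normal "factors" for each person.
factors = rng.normal(size=(n_people, n_factors))

# Additive model: ability is the sum of many small contributions.
additive = factors.sum(axis=1)

# Multiplicative model: each factor independently scales ability by ~2%.
multiplicative = (1 + 0.02 * factors).prod(axis=1)

def skewness(x):
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

print(f"additive skewness:       {skewness(additive):+.3f}")        # ~0 (symmetric, Gaussian-like)
print(f"multiplicative skewness: {skewness(multiplicative):+.3f}")  # clearly positive (right-skewed)
```

Even a fairly tame multiplicative structure produces noticeable right skew, which is part of why the near-Gaussian test results make me lean additive.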