I appreciate the depth of this discussion and willingness to share! I hope for more great content -
This podcast reinforced something for me. I used to think that containing or controlling superintelligence was largely a joke, and that there was little value in dumping resources into such approaches. This belief rested on a few hunches I assigned high probability. First, the upper bound on intelligence is very, very high. A simple argument for this: the gap between frog and human intelligence is massive, yet the energy and compute difference is relatively small. While jumping many OOMs in energy and flops doesn't guarantee a similar jump beyond human intelligence, it feels like...