"... than average" is (almost) meaningless
Recently, I was talking to a friend who hadn't seen me in a while. They mentioned that my hair had grown noticeably, and then asked whether my hair grew fast or slow. I said that my hair growth was probably around average, but on reflection, I realized that statement was so divorced from reality it would be a disservice to the notion of truth to even call it "false". Not only do I not know how I compare to the average, I have no idea what the average even is.

It seems that people (myself included) make this kind of statement a lot, even when it's obviously not backed by concrete evidence. So what's going on here?

I: Analysis

When people say "I'm more/less [X] than average", where [X] is anything that isn't trivially measurable, they're not using 'average' in any statistical sense. Rather, they're referring to their own generic mental template of somebody who is "average at [X]".

The first problem with this is that this template person isn't necessarily representative of any empirical measure of the average. It's a Frankenstein's monster cobbled together from personal experience, half-remembered data, anecdotes from the internet, fictional 'evidence', and who knows what else. It isn't completely uncorrelated with reality, but it's nowhere near a perfect match.

The second problem is that your mental model is not my mental model. Your idea of average is not the same as mine. Again, the two won't be maximally dissimilar, but I'd expect significant variance between people if you asked them to describe what their mental model of "average at [X]" actually entails.

So when somebody says, for example, that they're an above-average driver, you can't conclude anything about their driving skill until you also know what they think an average driver is. Yet when you hear them say that, you automatically match those words to your own idea of an average driver, and develop a preconception of their driving ability that's anchored to your model, not theirs.
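To make that concrete, here's a toy sketch (my own made-up numbers, not real accident data): two people make the identical claim "I'm 20% safer than the average driver", but because each carries a different mental model of the average, the claim cashes out to very different implied skill levels.

```python
# Toy illustration: the same "better than average" claim implies
# different things depending on the speaker's mental model of "average".
# All numbers below are invented for illustration.

mental_models = {
    "Alice": 6.0,  # Alice imagines the average driver has 6 accidents per million miles
    "Bob": 2.0,    # Bob imagines the average driver has 2 accidents per million miles
}

claimed_improvement = 0.20  # both say they're "20% safer than average"

for person, assumed_average in mental_models.items():
    implied_rate = assumed_average * (1 - claimed_improvement)
    print(f"{person}: implied accident rate = {implied_rate:.1f} per million miles")

# Alice: implied accident rate = 4.8 per million miles
# Bob:   implied accident rate = 1.6 per million miles
#
# Same sentence, very different claims, because the reference point
# lives in each speaker's head rather than in any shared statistic.
```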