This is a reply to a comment by Yvain, and to everyone who might have misunderstood the problem I tried to highlight.
Here is the problem. You can't estimate the probability and magnitude of the advantage an AI will have if your estimate rests on a concept as vague as 'intelligence'.
Here is a case that bears some similarity and might shed light on what I am trying to explain:
At his recent keynote speech at the New York Television Festival, former Star Trek writer and creator of the re-imagined Battlestar Galactica Ron Moore revealed the secret formula to writing for Trek.
He described how the writers would just insert "tech" into the scripts whenever they needed to resolve a story or plot line, then they'd have consultants fill in the appropriate words (aka technobabble) later.
"It became the solution to so many plot lines and so many stories," Moore said. "It was so mechanical that we had science consultants who would just come up with the words for us and we'd just write 'tech' in the script. You know, Picard would say 'Commander La Forge, tech the tech to the warp drive.' I'm serious. If you look at those scripts, you'll see that."
Moore then went on to describe how a typical script might read before the science consultants did their thing:
La Forge: "Captain, the tech is overteching."
Picard: "Well, route the auxiliary tech to the tech, Mr. La Forge."
La Forge: "No, Captain. Captain, I've tried to tech the tech, and it won't work."
Picard: "Well, then we're doomed."
"And then Data pops up and says, 'Captain, there is a theory that if you tech the other tech ... '" Moore said. "It's a rhythm and it's a structure, and the words are meaningless. It's not about anything except just sort of going through this dance of how they tech their way out of it."
The use of 'intelligence' in evaluating risks from AI is as misleading and dishonest as the use of 'tech' in Star Trek.
It is true that 'intelligence', just like 'technology', has some explanatory power. Just like 'emergence' has some explanatory power. As in "the morality of an act is an emergent phenomenon of a physical system: it refers to the physical relations among the components of that system". But that does not help to evaluate the morality of an act, or to predict whether a given physical system will exhibit moral properties.