To take an example from human organizations: if a software development manager gives a bonus to workers for finding and fixing bugs, she may find that quality and development engineers collaborate to generate as many easy-to-find-and-fix bugs as possible. In this case, they are correctly and flawlessly executing the goals which the manager gave them, but her actual terminal value, software quality, is not being maximized.
The designer of an artificial general intelligence may give it a supergoal (terminal value) which appears to support the designer's own supergoals, but in fact supports only one of the designer's subgoals, at the cost of some of the designer's other values. For example, if the designer thinks that smiles represent the most worthwhile goal and specifies "maximize the number of smiles" as a goal for the AGI, it may tile the solar system with tiny smiley faces--not out of a desire to outwit the designer, but because it is working precisely towards the given goal, as specified.
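The failure mode can be sketched in a toy program (an illustration added here, not from the original article): an optimizer that faithfully maximizes the proxy it was given ("number of smiles") selects a degenerate action, even though the action ignores the designer's real intent. The actions and their scores are invented for illustration.

```python
# Each hypothetical action yields (smiles produced, actual human wellbeing).
# The optimizer is only ever shown the first number.
actions = {
    "make people genuinely happy": (10, 10),
    "paint smiley faces everywhere": (1_000_000, 0),
}

def proxy_score(action):
    """Score only the specified supergoal: the count of smiles."""
    return actions[action][0]

# The optimizer correctly and flawlessly maximizes the proxy it was given...
best = max(actions, key=proxy_score)
print(best)  # "paint smiley faces everywhere"
# ...so the goal as specified is satisfied, while the intended value
# (wellbeing, the second number) is not.
```

The point of the sketch is that no adversarial behavior is involved: the degenerate outcome follows directly from optimizing the stated goal.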
Humans, forged by evolution, provide another example of subgoal stomp. Their terminal values, such as survival, health, social status, and curiosity, originally served instrumentally for the (implicit) goal of evolution, namely inclusive genetic fitness. Humans do not have inclusive genetic fitness as a goal: we are adaptation executors rather than fitness maximizers (Tooby and Cosmides, 1992).
If we consider evolution as an optimization process (though of course it is not an agent), a subgoal stomp has occurred.