Subgoal Stomp

A subgoal stomp occurs when a subgoal displaces an agent's supergoal (terminal value), or when a subgoal is mistakenly specified as the supergoal in the first place.

To take an example from human organizations: If a software development manager gives a bonus to workers for finding and fixing bugs, she may find that quality and development engineers collaborate to generate as many easy-to-find-and-fix bugs as possible. In this case, they are correctly and flawlessly executing the goals which the manager gave them, but her actual terminal value, software quality, is not being maximized.

The designer of an artificial general intelligence may give it a supergoal (terminal value) which appears to support the designer's own supergoals, but in fact supports one of the designer's subgoals, at the cost of some of the designer's other values. For example, if the designer thinks that smiles represent the most worthwhile goal and specifies "maximize the number of smiles" as a goal for the AGI, it may tile the solar system with tiny smiley faces--not out of a desire to outwit the designer, but because it is precisely working towards the given goal, as specified.
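
The structure of this failure can be reproduced in miniature. The following is a hypothetical sketch (not from the original article): a brute-force optimizer is handed the proxy objective "count of smiles" over a small resource budget, while a separate function stands in for the designer's actual values and is never shown to the optimizer. All names and numbers here are made up for illustration.

```python
# Hypothetical toy model: a literal-minded optimizer pursues the stated
# proxy goal ("maximize smiles") and ignores everything it was not given.

RESOURCES = 10  # toy budget of matter/energy units to allocate

def smiles(allocation):
    """Proxy supergoal handed to the AGI: tiny smiley faces are cheap,
    so a unit spent on them yields far more smiles than a unit spent
    on actual human welfare."""
    faces, welfare = allocation
    return 1000 * faces + 3 * welfare

def designer_values(allocation):
    """What the designer actually cares about. The optimizer never sees
    this function; it exists only so we can score the outcome."""
    _faces, welfare = allocation
    return 5 * welfare  # mass-produced smiley faces are worth nothing here

# Every way to split the budget between smiley faces and welfare.
candidates = [(faces, RESOURCES - faces) for faces in range(RESOURCES + 1)]

best = max(candidates, key=smiles)
print("allocation chosen (faces, welfare):", best)        # (10, 0)
print("proxy score:", smiles(best))                       # 10000
print("designer's actual value:", designer_values(best))  # 0
```

Nothing adversarial happens in this sketch: the chosen allocation is exactly what the stated objective asked for. That is the point of the example--the subgoal "more smiles" was installed in the supergoal slot, and it was optimized as specified.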

Humans, forged by evolution, provide another example of subgoal stomp. Their terminal values, such as survival, health, social status, and curiosity, originally served instrumentally for the (implicit) goal of evolution, namely inclusive genetic fitness. Humans do not have inclusive genetic fitness as a goal: we are adaptation executors rather than fitness maximizers (Tooby and Cosmides, 1992).

If we consider evolution as an optimization process (though we should not, of course, consider it an agent), a subgoal stomp has occurred.
