Terminal Value

A terminal value (also known as an intrinsic value) is an ultimate goal, an end-in-itself. The non-standard term "supergoal" is used for this concept in Eliezer Yudkowsky's earlier writings.

In an artificial general intelligence with a utility or reward function, the terminal value is the maximization of that function. The concept is not usefully applicable to all AIs, and it is not known how applicable it is to organic entities.
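
As a rough illustration (a minimal sketch invented here, not a specification of any actual AGI; the state variables, actions, and utility function are arbitrary), the relationship can be put in code: the agent's terminal value is whatever utility() returns, and everything else it does is instrumental to maximizing that number.

```python
# Toy sketch: an agent whose *terminal* value is maximizing a utility function.
# Predicting outcomes and ranking actions are instrumental to that single goal.
# The world model, actions, and utility function are invented for illustration.

def utility(state: dict) -> float:
    """Terminal value: the quantity the agent ultimately tries to maximize."""
    return 2.0 * state["resources"] - 1.0 * state["risk"]

def predict(state: dict, action: str) -> dict:
    """Toy world model mapping an action to a predicted successor state."""
    effects = {
        "explore": {"resources": 3, "risk": 2},
        "exploit": {"resources": 1, "risk": 0},
        "wait":    {"resources": 0, "risk": -1},
    }[action]
    return {key: state[key] + effects[key] for key in state}

def choose_action(state: dict, actions: list[str]) -> str:
    """Instrumental reasoning: pick the action that best serves the terminal value."""
    return max(actions, key=lambda a: utility(predict(state, a)))

if __name__ == "__main__":
    state = {"resources": 0, "risk": 0}
    print(choose_action(state, ["explore", "exploit", "wait"]))  # -> "explore"
```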

Terminal values stand in contrast to instrumental values (also known as extrinsic values), which are means-to-an-end, mere tools in achieving terminal values. For example, if a given university student studies merely as a professional qualification, his terminal value is getting a job, while getting good grades is an instrument to that end. If a (simple) chess program tries to maximize piece value three turns into the future, that is an instrumental value to its implicit terminal value of winning the game.
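
The chess example can be made concrete with a short sketch. The toy engine below is illustrative only: it assumes the third-party python-chess package (`pip install chess`), and the search depth and piece values are arbitrary choices rather than anything prescribed here. Material is the instrumental value being maximized; the program's implicit terminal value remains winning the game.

```python
# Toy chess search: maximize material (piece value) a few plies ahead.
# Material is an instrumental proxy; the implicit terminal value is winning.
import chess  # third-party python-chess package

PIECE_VALUES = {chess.PAWN: 1, chess.KNIGHT: 3, chess.BISHOP: 3,
                chess.ROOK: 5, chess.QUEEN: 9, chess.KING: 0}

def material(board: chess.Board, color: bool) -> int:
    """Material balance from `color`'s point of view (chess.WHITE/BLACK are booleans)."""
    score = 0
    for piece in board.piece_map().values():
        value = PIECE_VALUES[piece.piece_type]
        score += value if piece.color == color else -value
    return score

def search(board: chess.Board, depth: int, color: bool) -> float:
    """Look `depth` plies ahead; our side maximizes material, the opponent minimizes it."""
    if depth == 0 or board.is_game_over():
        return material(board, color)
    scores = []
    for move in board.legal_moves:
        board.push(move)
        scores.append(search(board, depth - 1, color))
        board.pop()
    return max(scores) if board.turn == color else min(scores)

def best_move(board: chess.Board, depth: int = 3) -> chess.Move:
    """Pick the move whose looked-ahead material score is best for the side to move."""
    color = board.turn
    best, best_score = None, float("-inf")
    for move in board.legal_moves:
        board.push(move)
        score = search(board, depth - 1, color)
        board.pop()
        if score > best_score:
            best, best_score = move, score
    return best

if __name__ == "__main__":
    print(best_move(chess.Board()))
```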

Some values may be called "terminal" merely in relation to an instrumental goal, yet themselves serve instrumentally towards a higher goal. However, in considering future artificial general intelligence, the phrase "terminal value" is generally used only for the top level of the goal hierarchy of the AGI itself: the true ultimate goals of the system, excluding goals inside the AGI that serve other goals, and excluding the purpose of the AGI's makers, the goal for which they built the system.

It is not known whether humans have terminal values that are clearly distinct from another set of instrumental values. Humans appear to adopt different values at different points in life. Nonetheless, if the theory of terminal values applies to humans, then their system of terminal values is quite complex. The values were forged by evolution in the ancestral environment to maximize inclusive genetic fitness. These values include survival, health, friendship, social status, love, joy, aesthetic pleasure, curiosity, and much more. Evolution's implicit goal is inclusive genetic fitness, but humans do not have inclusive genetic fitness as a goal. Rather, these values, which were instrumental to inclusive genetic fitness, have become humans' terminal values (an example of subgoal stomp).

Humans cannot fully introspect their terminal values. Humans' terminal values are often mutually contradictory, inconsistent, and changeable.
