
If an entity reliably pushes reality toward some state -- across many contexts, not just by accident -- then we can say it prefers that state. For example, a chess program that steers games toward won positions regardless of what its opponent does can be described as preferring to win. In this sense, preferences are roughly equivalent to goals and values.

Preference orderings that satisfy certain rationality axioms -- completeness, transitivity, continuity, and independence -- can be represented by a utility function. This is the von Neumann-Morgenstern utility theorem.
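
As a sketch of what "represented by a utility function" means (the notation here is illustrative, not from the original text): for an agent whose preference relation $\succeq$ over lotteries satisfies the axioms, there exists a utility function $u$ over outcomes such that

$$A \succeq B \iff \mathbb{E}_{A}[u] \ge \mathbb{E}_{B}[u],$$

i.e., the agent weakly prefers lottery $A$ to lottery $B$ exactly when $A$ yields at least as much expected utility. Such a $u$ is unique up to positive affine transformations $u \mapsto au + b$ with $a > 0$.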
