Without really making a point here, I think it's possible to make the definition of "selfishness" broad enough that everything a rational agent does counts as selfish.

Like, you can also make the definition of "god" broad enough that the probability of a god existing gets arbitrarily close to 1 (for example, by counting the gravitational force as a god). So, if we define "selfishness" as "maximizing your utility function", then every rational agent is selfish by the definition of "rational agent" (note that the utility function can value other people). Of course, as the text quoted above says, the word then loses all its usefulness.
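As a minimal sketch of that parenthetical (the names and numbers are purely hypothetical, not from any particular decision-theory formalism): an agent whose utility function puts weight on someone else's welfare will pick the "altruistic" option, and under the broad definition that choice still counts as "selfish" because it maximizes the agent's own utility.

```python
def utility(own_welfare: float, friends_welfare: float, care_weight: float = 0.8) -> float:
    """Agent's utility: own welfare plus a weighted term for a friend's welfare."""
    return own_welfare + care_weight * friends_welfare

# Two options: keep all the resources, or share half with the friend.
options = {
    "keep_everything": utility(own_welfare=10.0, friends_welfare=0.0),   # 10.0
    "share_half":      utility(own_welfare=5.0, friends_welfare=9.0),    # 12.2
}

best = max(options, key=options.get)
print(best)  # "share_half": the "altruistic" act is what maximizes the agent's own utility
```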

I think even an extreme example like "What about an agent who is forced, under threat of death, to do something that decreases their utility?" falls under that broad definition, because a rational agent will only comply if they expect death to be even worse under their utility function.

Of course, humans are not really rational agents, so the original question of whether humans are always selfish is a bit harder to answer.