ekaterimburgo735
ekaterimburgo735 has not written any posts yet.

What you're describing doesn't seem to be empathy in its fullest sense; it seems more like a projection of your own biases and anxieties onto others, disguised as empathy. Real empathy generally involves trying to understand what it's like to be the other person—including their fears, their desires, their biochemistry, their traumas, and their particular worldview—not simply imagining yourself in their place by applying your own frame of reference and then looking down on them for not sharing it.
Thank you for the follow-up post. The distinction you draw between empathy and the attribution of "moral agency" is a helpful update and clarifies the crux of your original argument.
That said, the conflict you describe doesn't seem to stem from the mere attribution of moral agency, but from an implicit assumption about the *utility function* that all moral agents *should* be optimizing.
If we model humans as self-optimizing agents, we can distinguish between:
1. Terminal Goals: The final, intrinsic objective. I would argue that for most humans, this approximates some form of "well-being" or "satisfaction" (happiness, to put it simply). It is the utility the system is trying to maximize.
2. Instrumental Goals: The subgoals an agent pursues because... (read more)