What you're describing doesn't seem to be empathy in its fullest sense; it seems more like a projection of your own biases and anxieties onto others, disguised as empathy. Real empathy generally involves trying to understand what it's like to be the other person—including their fears, their desires, their biochemistry, their traumas, and their particular worldview—not simply imagining yourself in their place by applying your own frame of reference and then looking down on them for not sharing it.
Thank you for the follow-up post. The distinction between empathy and the attribution of "moral agency" is a very helpful update and greatly clarifies the crux of your original argument.
That said, your conflict doesn't seem to stem from simply attributing moral agency, but from an implicit assumption about the *utility function* that all moral agents *should* be optimizing.
If we model humans as self-optimizing agents, we can distinguish between:
1. Terminal Goals: The final, intrinsic objective. I would argue that for most humans, this approximates some form of "well-being" or "satisfaction" (happiness, to put it simply). It is the utility the system is trying to maximize.
2. Instrumental Goals: The subgoals an agent pursues because it believes they will help it achieve its terminal goal. This is where everything else comes in: strength (`Tsuyoku naritai`), competence, knowledge, wealth, social connection, validation, security, etc.
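The split above can be sketched as a toy model (the agent names, goals, and weights here are purely illustrative, not drawn from your post): two agents maximize the same terminal quantity, "well-being," but weight the instrumental subgoals feeding into it differently, so the same world-state yields different utilities.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    # How strongly each instrumental goal contributes to this
    # agent's well-being (the weights are illustrative).
    instrumental_weights: dict = field(default_factory=dict)

    def well_being(self, progress: dict) -> float:
        """Terminal utility: weighted sum of progress on instrumental goals."""
        return sum(self.instrumental_weights.get(goal, 0.0) * amount
                   for goal, amount in progress.items())

# Both agents share the terminal goal (maximize well_being) but not the
# instrumental weights that define how to get there.
striver = Agent("striver", {"competence": 0.8, "social_connection": 0.2})
connector = Agent("connector", {"competence": 0.1, "social_connection": 0.9})

# The same change in the world...
progress = {"competence": 2.0, "social_connection": 0.5}

# ...is scored very differently by the two utility functions:
print(striver.well_being(progress))    # 1.7
print(connector.well_being(progress))  # 0.65
```

Judging the second agent for scoring low here is a type error: it is optimizing its own weights perfectly well, just not the first agent's.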
Your original post and this follow-up suggest that you have elevated a very specific instrumental goal—competence and personal growth—to the status of a universal terminal goal. The "disgust" or "disappointment" you feel when activating the "moral agency module" seems to be a reaction to agents who are not optimizing for *your* chosen instrumental goal.
The conclusion isn't that you should "lower your standards" for yourself. Rather, it may be that treating someone as a "moral agent" doesn't just mean demanding that they be responsible, but also recognizing that their responsibility is to their own, unique utility function, not to yours.