Today's post, Sympathetic Minds, was originally published on 19 January 2009. A summary (taken from the LW wiki):

Mirror neurons are neurons that fire both when you perform an action yourself and when you watch someone else perform the same action - for example, a neuron that fires when you raise your hand or watch someone else raise theirs. We predictively model other minds by putting ourselves in their shoes, which is empathy. But some of our desire to help relatives and friends, or our concern with the feelings of allies, is expressed as sympathy: feeling what (we believe) they feel. Like "boredom", the human form of sympathy would not be expected to arise in an arbitrary expected-utility-maximizing AI. Most such agents would regard any other agents in their environment as just a special case of complex systems to be modeled or optimized; they would not feel what those agents feel.
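
That last claim can be made concrete with a minimal sketch. The Python snippet below is purely illustrative (none of its names, payoffs, or structure come from the post): it contrasts an expected-utility maximizer whose utility function ignores another agent's welfare with one whose utility function includes it. Both agents model the other agent equally well; the difference is entirely in what the utility function counts.

```python
# Toy sketch (illustrative, not from the post): an expected-utility maximizer
# treats other agents as just another feature its world model predicts. It
# "cares" about their feelings only if they appear in its utility function.

def expected_utility(action, world_model, utility):
    """Probability-weighted utility of an action's predicted outcomes."""
    return sum(p * utility(outcome) for p, outcome in world_model(action))

def best_action(actions, world_model, utility):
    """Pick the action that maximizes expected utility under the model."""
    return max(actions, key=lambda a: expected_utility(a, world_model, utility))

# Outcomes are (my_payoff, other_agent_welfare) pairs. The other agent's
# welfare is simply a quantity the model predicts, like weather or terrain.
def world_model(action):
    if action == "exploit":
        return [(1.0, (10, -5))]   # good for this agent, bad for the other
    return [(1.0, (6, 6))]         # "cooperate": decent for both

selfish = lambda o: o[0]              # other's welfare never enters the utility
sympathetic = lambda o: o[0] + o[1]   # other's welfare is valued directly

print(best_action(["exploit", "cooperate"], world_model, selfish))      # exploit
print(best_action(["exploit", "cooperate"], world_model, sympathetic))  # cooperate
```

The point of the contrast: modeling another mind accurately is not the same as feeling with it. The selfish agent predicts the other's suffering perfectly well and exploits anyway.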


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was In Praise of Boredom, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

I'm not certain the post distinguishes 'empathy' and 'sympathy' the way I learned to at some point. In any case, I'm going to stick with my previous distinction, since it has seemed to work well, unless the original post does make a different distinction and someone wants to argue it's germane to understanding the thesis of the post.

The distinction I learned was that empathy is when you actually feel what the other person is feeling, by imagining yourself in their shoes (where the occurrence of actual, real feelings seems well explained by the mechanism of mirror neurons). Sympathy, in contrast, is when you cerebrally know that they must be feeling bad, and care about that, even though you don't have the negative feelings yourself.

I try to sympathize with my coworkers rather than empathize, so that I'm not emotionally drained by talking with them but can still be friendly. My young daughter has pretty well-developed empathy (she doesn't like anyone to be upset) but very little sympathy, because she doesn't yet have any good models of what people care about.

So empathy is hardware and sympathy is software?

If this definition of sympathy is adequate (cerebrally knowing how someone is affected, together with caring how they feel), then it seems an intelligence would be sympathetic whenever its self-interest is aligned with the other's well-being.

For example, would we say that humans are "sympathetic" to a cultivated crop, if we care about the health of the crop and would eliminate threats to that crop?
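
One way to sharpen the crop question is to separate terminal from instrumental caring. Here is a toy sketch (my illustration, not from the thread): the farmer's utility depends on the crop's health only through the revenue it produces, so the apparent "sympathy" vanishes the moment that instrumental link breaks.

```python
# Toy sketch (illustrative only): "instrumental sympathy". The farmer's
# utility depends on crop health only via expected revenue, so the farmer
# protects the crop exactly as long as doing so serves its own interests.

def farmer_utility(crop_health, price_per_unit):
    harvest = max(0.0, min(1.0, crop_health))  # healthier crop, bigger harvest
    return harvest * price_per_unit            # health matters only via revenue

# While the price is positive, raising crop health raises utility, so the
# farmer behaves "sympathetically" toward the crop (waters it, kills pests):
print(farmer_utility(crop_health=0.9, price_per_unit=100))  # 90.0
print(farmer_utility(crop_health=0.2, price_per_unit=100))  # 20.0

# If the market collapses, the crop's fate becomes a matter of indifference;
# the "caring" was never about the crop itself:
print(farmer_utility(crop_health=0.9, price_per_unit=0))    # 0.0
```

Whether that sort of incentive-contingent caring still deserves the name "sympathy" is exactly the question the crop example raises.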

I don't think empathy needs to be emotionally draining. I observe that some people are highly empathetic yet not dragged down by it at all: once they finish assisting or meeting a suffering person, they have undaunted joy and energy. Since realizing this, I've tried to emulate them, at least on the not-dragged-down side of the equation. I think I've made progress. I don't know how - my self-insight isn't great - but maybe just by seeing it as possible.

For what it's worth, your definitions of the terms match the ones I've seen.