Today's post, Purchase Fuzzies and Utilons Separately, was originally published on 01 April 2009. A summary (taken from the LW wiki):

Wealthy philanthropists typically make the mistake of trying to purchase warm fuzzy feelings, status among friends, and actual utilitarian gains, simultaneously; this results in vague pushes along all three dimensions and a mediocre final result. It should be far more effective to spend some money/effort on buying altruistic fuzzies at maximum optimized efficiency (e.g. by helping people in person and seeing the results in person), buying status at maximum efficiency (e.g. by donating to something sexy that you can brag about, regardless of effectiveness), and spending most of your money on expected utilons (chosen through sheer cold-blooded shut-up-and-multiply calculation, without worrying about status or fuzzies).
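
To make the "shut up and multiply" step concrete, here is a minimal sketch in Python. The charity names, success probabilities, utilon figures, and costs are all invented for illustration and do not come from the post; the only point is that the ranking falls out of arithmetic rather than out of how the options feel.

```python
# Hypothetical example: rank giving opportunities by expected utilons
# per dollar, ignoring fuzzies and status entirely.

# Each entry: (probability the intervention works, utilons produced
# if it works, cost in dollars). All numbers are made up.
charities = {
    "sexy_space_telescope": (0.90, 1_000, 50_000),
    "deworming_program":    (0.70, 20_000, 10_000),
    "local_soup_kitchen":   (0.99, 500, 2_000),
}

def expected_utilons_per_dollar(p_success, utilons_if_success, cost):
    """Shut up and multiply: E[utilons] / cost."""
    return p_success * utilons_if_success / cost

ranked = sorted(
    charities.items(),
    key=lambda kv: expected_utilons_per_dollar(*kv[1]),
    reverse=True,
)

for name, params in ranked:
    print(f"{name}: {expected_utilons_per_dollar(*params):.3f} utilons/$")
```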


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Helpless Individuals, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

2 comments

The utility generated by this post is probably huge! We also have to take into account that humans are running on corrupted hardware, which is worth remembering before instantly scorning others for splitting between charities. (Though of course the prior probability that a typical person splitting between charities is doing it in the efficient way described in the article is very low.)

I think it's worth noting that sometimes utilons and fuzzies do go together, and it helps to try to condition oneself so that this becomes common. How far this is possible probably depends a lot on the personality one starts out with. For instance, some people might get meta-contrarian fuzzies out of things like not tipping waiters, or from knowing that they are "cold" enough to just multiply, whereas others might have huge difficulties overruling pre-existing emotional connections to certain charities or causes. Finally, something I also find quite helpful and motivating is trying to visualize some of the impact one is having by donating to the most cost-effective cause (only some of it, given scope insensitivity). Personally, I get a lot of fuzzies after (not while!) watching horrible videos on wild animal suffering.

Somewhat unrelated: I find it interesting that Eliezer here uses the words altruism / altruistic and "other-directed benefits". Of course, this is what you would expect given the topic. However, after reading the metaethics sequence, I was puzzled that Eliezer almost never (iirc) refers to anything "other-regarding". Instead I gathered the impression that his definition of "good" is simply about optimizing values that he, or humans in general, (tend to) care about, and not about anything "other-benefiting" or even necessarily about others at all. But I might be totally wrong on this; I'm not sure I understood everything correctly!

An interesting question related to this post is how fuzzies should count in a utility function.
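
One way to make that question concrete, purely as a sketch: treat fuzzies as one term in the donor's utility function with sharply diminishing returns, alongside a roughly linear utilon term. The functional form, the weights, and the log model below are all assumptions made for illustration, not anything from the post.

```python
import math

# Hypothetical model: total utility from splitting a budget between
# a "fuzzy" purchase (diminishing returns) and expected utilons
# (roughly linear at individual-donor scale). All constants invented.

BUDGET = 10_000.0          # dollars to allocate
UTILONS_PER_DOLLAR = 1.4   # from the (made-up) best charity above
FUZZY_WEIGHT = 50.0        # how much the donor values fuzzies

def total_utility(fuzzy_spend):
    utilon_spend = BUDGET - fuzzy_spend
    fuzzies = FUZZY_WEIGHT * math.log1p(fuzzy_spend)  # diminishing returns
    utilons = UTILONS_PER_DOLLAR * utilon_spend       # ~linear
    return fuzzies + utilons

# Interior optimum of w*log(1+x) + u*(B-x) is x* = w/u - 1.
best = max(0.0, FUZZY_WEIGHT / UTILONS_PER_DOLLAR - 1)
print(f"optimal fuzzy spend: ${best:.2f} of ${BUDGET:.0f}")
print(f"total utility there: {total_utility(best):.1f}")
```

Under this assumed log model the optimum puts only a few dozen dollars into fuzzies, which happens to match the post's advice: buy fuzzies cheaply and efficiently, then pour the rest into utilons.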