Long-delayed followup to: Roleplaying As Yourself

(Another simple intuition pump, this one especially useful for effective altruists who are struggling with wanting to do more or worrying they're not doing enough.)

Previously I wrote about a mental tool for prompting good consequentialist reasoning: ask yourself what a skilled alien roleplayer (here Gurgeh, from Player of Games) would do if they were controlling you, had to take only actions you could plausibly take, and scored points for achieving your goals.

This also serves as a standard for comparing your own actions, though as an aspiration rather than as an expectation.

The reason I mention this is that a good number of people in the rationality and effective altruism communities suffer from scrupulosity, the sense of guilt for not living up to an unattainable standard of conduct. And if we're going to speak to that sense, we need to start by getting the right standard of excellence to bargain with.

(If you're feeling scrupulous about altruism in particular, then you can imagine that Gurgeh gets points for achieving only your altruistic aims, though he's still constrained by your actual needs - he wouldn't steer you into a burnout, that wouldn't maximize his score.)

This standard is insanely daunting. Fortunately, it's not fair to ask you to meet it.

After all, you're not perfectly altruistic, and the other parts get to bargain too.

In Nobody Is Perfect, Everything Is Commensurable, Scott suggests that we deal with scrupulosity by letting ourselves be okay with the standard of giving 10% of our output to the most effective charitable causes. He runs into a bit of a problem when dealing with the fact that people are in different places in their careers (and that a tenth of one's income can be a large or small chunk of one's disposable income), and punts on the question a bit:

If you make $30,000 and you accept 10% as a good standard you want to live up to, you can either donate $3000 to charity, or participate in political protests until your number of lives or dollars or DALYs saved is equivalent to that.

I think this is the right place to introduce the alien gamer roleplaying your character. Are you building intangible expertise or career capital? Gurgeh notices the high payoff in later rounds of the game from these resources, and would be happy to forgo a little more short-term impact if your time/money/attention can translate into those resources more effectively. Are you torn between multiple opportunities to do good? Gurgeh checks once to see if there's a synergy between them (a way to get a higher combined total than he would optimizing for either alone), and if not, he ruthlessly picks the one that translates more efficiently into points, and doesn't feel bad about leaving behind a less efficient path.

So here's my suggestion:

Estimate the score you'd actually expect Gurgeh to get in "The Altruistic You Experience", then consider ways to achieve at least one-tenth of that score, and let that be your target for moral achievement.

This is still a really high standard, one that few achieve! It almost surely isn't enough to take your default path in life while giving even 50% of your income to the best charity. It may require you to change your career, your social circles, your everyday habits. It may ask you to do lots of self-experimentation, with the corresponding expectation of frequent failure.

But it at least leaves more slack for your own flourishing than attempting to achieve the altruistic high score. It lets you seek a way of achieving excellence that satisfies your other wants and needs well. Maybe you don't take your altruistic best option if your second best is much more personally fulfilling; maybe you go ahead and splurge on something big every now and then. But you don't lose sight of your aspiration.

I just want to emphasize:

It's okay to give yourself more happiness and more leisure than you need in order to be effective. It's okay to care more about your own well-being, and that of your family and friends, than about that of strangers in far-off lands or times.

It's okay to be mostly selfish. Just be strategic about the altruistic part.

1 comment

Great post! It's like the "what if an alien took control of you" exercise but feels more playful and game-y. I started a Google doc to plan the month of April from Gurgeh's perspective.

See also: Outside.