Elitism and Effective Altruism

 

Many criticize Effective Altruists as elitist. While this criticism is vastly overblown, it unfortunately has some basis, not only from the outside looking in but also within the movement itself, where some explicitly argue for elitism.

 

Within many EA circles, there are status games and competition around doing "as much as we can," and in many cases even judging and shaming - usually implicit and unintended, but no less real - of those whom we might term softcore EAs. These are people who identify as EAs and donate money and time to effective charities, but otherwise lead regular lives, as opposed to devoting the brunt of their resources to advancing human flourishing, as hardcore EAs do. To be clear, there is no hard and definitive distinction between softcore and hardcore EAs, but it is a useful heuristic, as long as we keep in mind that softcore and hardcore are poles on a spectrum rather than binary categories.

 

We should help softcore EAs feel proud of what they do, and beware of implying that being a softcore EA is somehow deficient, or merely the start of an inevitable path to becoming a hardcore EA. This sort of mentality has caused people I know to feel guilty and ashamed, and has led some to leave the EA movement. Remember that we all suffer from survivorship bias, seeing those who remained and not those who left - I specifically talked to people who left, and tried to get their takes on why they did so.

 

I suggest we aim to respect people wherever they are on the softcore/hardcore EA spectrum. I propose that, from a consequentialist perspective, negative attitudes toward softcore EAs are counterproductive for doing the most good for the world.

 

Why We Need Softcore EAs

 

Even if the individual contributions of softcore EAs are much smaller than those of individual hardcore EAs, it's irrational and anti-consequentialist to fail to acknowledge and celebrate the contributions of softcore EAs - yet that is the status quo in the EA movement. As in any movement, the majority of EAs are not deeply committed activists, but normal people for whom EA is a valuable but not primary identity category.

 

All of us were softcore EAs once - if you are a hardcore EA now, envision yourself back in those shoes. How would you have liked to be treated? Acknowledged and celebrated, or pushed to do more and more and more? How many softcore EAs around us are suffering right now under the pressure of expectations to ratchet up their contributions?

 

I get it. I myself am driven by powerful emotional urges to reduce human suffering and increase human flourishing. Besides my full-time job as a professor, which takes about 40 hours per week, I've been working 50-70 hours per week for the last year and a half as the leader of an EA and rationality-themed meta-charity. Like all people, when I don't pay attention, I fall unthinkingly into the mind projection fallacy, assuming other people think like I do, share my values, and have my capacity for productivity and impact. Part of my emotional self has a knee-jerk pattern of identifying with and giving social status to fellow hardcore EAs, and considering us an in-group above softcore EAs.

 

These are natural human tendencies, but destructive ones. From a consequentialist perspective, they weaken our movement and undermine our capacity to build a better world and decrease suffering for current and future humans and other species.

 

More softcore EAs are vital for the movement itself to succeed. Softcore EAs can help fill talent gaps and donate to effective direct-action charities, having a strong positive impact on the outside world. Within the movement, they support hardcore EAs emotionally by giving them a sense of belonging, safety, security, and encouragement, which are key for motivation and for mental and physical health. Softcore EAs also donate to and volunteer for EA-themed meta-charities, provide advice and feedback, and serve as evangelists for the movement.

 

Moreover, softcore EAs remind hardcore EAs of the importance of self-care and taking time off for themselves. This is something we hardcore EAs must not ignore! I’m speaking from personal experience here.

 

Fermi Estimates of Hardcore and Softcore Contributions

 

If we add up the resources contributed to the movement by softcore EAs, the total will likely be substantially more than the resources contributed by hardcore EAs. For instance, the large majority of those who took the Giving What We Can and The Life You Can Save pledges are softcore EAs, and so, by definition, are all new entrants to the EA movement.

 

To attach some numbers to this claim, let's do a Fermi estimate, using educated guesses to get at the actual resources each group contributes. Say that for every 100 EAs, there are 5 hardcore EAs and 95 softcore EAs. We can describe softcore EAs as contributing anywhere from 1 to 10 percent of their resources to EA causes (the range from The Life You Can Save pledge to the Giving What We Can pledge), so let's guesstimate around 5 percent. Hardcore EAs, we can say, give an average of 50 percent of their resources to the movement. Using the handy Guesstimate app, here is a link to a model showing that softcore EAs contribute about 480 units of resources per 100 EAs, while hardcore EAs contribute about 250 (the simple point estimates are 95 × 5 = 475 and 5 × 50 = 250). These are educated guesses, and you can use the model I put together to plug in your own numbers for the split of hardcore and softcore EAs per 100, and for the percent of resources each group contributes. In any case, you will find that softcore EAs contribute a substantial amount of resources.
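
For those who prefer code to a web app, here is a minimal Python sketch of the same calculation. The function and parameter names are mine, and the figures are just the guesses above; it reproduces the 475/250 point estimates rather than the Guesstimate model's sampled 480/250.

```python
# Fermi estimate: resource units contributed per 100 EAs, as point estimates.
# All figures are the educated guesses from the text - adjust to taste.

def contributions(total_eas=100, hardcore_fraction=0.05,
                  softcore_gives=0.05, hardcore_gives=0.50,
                  units_per_person=100):
    """Return (softcore_total, hardcore_total) in resource units."""
    hardcore = total_eas * hardcore_fraction                       # 5 people
    softcore = total_eas - hardcore                                # 95 people
    softcore_total = softcore * units_per_person * softcore_gives  # 475 units
    hardcore_total = hardcore * units_per_person * hardcore_gives  # 250 units
    return softcore_total, hardcore_total

soft, hard = contributions()
print(f"softcore: {soft:.0f} units, hardcore: {hard:.0f} units per 100 EAs")
```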

 

We should also compare the giving of softcore EAs to that of members of the general public, to get a better grasp of the benefits softcore EAs provide to the world. Let's say a typical member of the general public contributes 3.5% of her resources to charitable causes, compared to 5% for softcore EAs. Being generous, we can estimate that the giving of non-EAs is one hundredth as effective as that of EAs. Using the same handy app, here is a link to a model demonstrating the impact of giving by a typical member of the general public, 3.5, versus 500 for a softcore EA (3.5 units of resources at baseline effectiveness, versus 5 units at 100 times that effectiveness). The impact of giving by a hardcore EA is of course higher still, 5000 as opposed to 500, but again, remember that there are many more softcore EAs giving resources. You're welcome to plug in your own numbers if my suggested figures don't match your intuitions. Regardless, you can see how high-impact a typical softcore EA is compared to a typical member of the general public.
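
The impact comparison reduces to multiplying the fraction of resources given by an effectiveness factor. Here is a minimal sketch under the same assumptions (again, the names and the 100× multiplier are just the guesses from the paragraph above):

```python
# Impact = resource units given x relative effectiveness of where they go.
# Effectiveness 1 is the general-public baseline; 100 is the generous guess
# from the text for EA-directed giving.

def impact(fraction_given, effectiveness, units_per_person=100):
    return units_per_person * fraction_given * effectiveness

print(impact(0.035, 1))    # typical member of the public -> 3.5
print(impact(0.05, 100))   # softcore EA                  -> 500.0
print(impact(0.50, 100))   # hardcore EA                  -> 5000.0
```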

 

Effective Altruism, Mental Health, and Burnout: A Personal Account

 

About two years ago, in February 2014, my wife and I co-founded our meta-charity. In the summer of that year, she suffered a nervous breakdown due to burnout from running the organization. I had to - or, to be accurate, chose to - take over both of our roles in managing the nonprofit, assuming the full burden of leadership.

 

In the fall of 2014, I myself started to develop a mental disorder from the strain of doing both my professor job and running the organization, while also taking care of my wife. It started with heightened anxiety, which I did not recognize as abnormal at the time - after all, with the love of my life recovering very slowly from a nervous breakdown and me running the organization, anxiety seemed natural. I was flinching away from my problem, unwilling to recognize it and pretending everything was fine, until some volunteers at the meta-charity I run - most of them softcore EAs - pointed it out to me.

 

I started to pay more attention to this, especially as I began to experience fatigue spells and panic attacks. With the encouragement of these volunteers, who essentially pushed me to get professional help, I began to see a therapist and take medication, which I continue to do to this day. I scaled back the time I put into the nonprofit, from an average of 70 hours per week to 50. Well, to be honest, I occasionally put in more than 50, as I'm very emotionally motivated to help the world, but I try to restrain myself. The softcore volunteers at the meta-charity know about my workaholism and the danger of burnout for me, and remind me to take care of myself. I also need to remind myself constantly that doing good for the world is a marathon, not a sprint, and that in the long run I will do much more good by taking it easy on myself.

 

Celebrating Everyone

 

As a consequentialist, I am convinced by this analysis, along with my personal experience, that the accomplishments of softcore EAs should be celebrated alongside those of hardcore EAs.

 

So what can we do? We should publicly showcase the importance of softcore EAs. For example, we can encourage the publication of articles that give softcore EAs the recognition they deserve, alongside articles about those who give a large portion of their earnings and time to charity. We can invite a softcore EA to speak about her/his experiences at the 2016 EA Global. We can publish interviews with softcore EAs. Now, I'm not suggesting that most speakers, articles, or interviews should feature softcore EAs. Overall, my take is that it's appropriate to celebrate individual EAs in proportion to their labors, and as the numbers above show, hardcore EAs individually contribute quite a bit more than softcore EAs. Yet we as a movement need to push back against the current norm of not celebrating softcore EAs at all, and these are some specific steps that would help us achieve this goal.

 

Let’s celebrate all who engage in Effective Altruism. Everyone contributes in their own way. Everyone makes the world a better place.

 

Acknowledgments: For their feedback on draft versions of this post, I want to thank Linch (Linchuan) Zhang, Hunter Glenn, Denis Drescher, Kathy Forth, Scott Weathers, Jay Quigley, Chris Waterguy (Watkins), Ozzie Gooen, Will Kiely, and Jo Duyvestyn. I bear sole responsibility for any oversights and errors remaining in the post, of course.

 


A different version of this, without the Fermi estimates, was cross-posted on the EA Forum.

 

 

EDIT: added link to post explicitly arguing for EA elitism

39 comments

FWIW, I'd describe myself as a softcore EA, and I haven't noticed any judging or shaming. I don't go out of my way to hang out with EAs, so that's probably part of it, but I have spent a while around them.

Before Ozy gave me the language of softcore/hardcore, I once said that I don't particularly think of myself as an EA, and got a look like "what are you talking about, you're totally an EA".

Glad you didn't get any sense of status games, and remember, I said "in many EA circles." For a different perspective, see 1 and 2.

To be clear, I too have experienced the "not sure I'm doing enough" feeling - that's why I said I didn't particularly think of myself as an EA. But that's not because I've encountered judging or shaming from other people, even implicit and unintended. (And I'm not saying that doesn't happen, either.)

You should really direct less effort in your articles toward complimenting yourself. (For reference, suggesting that your tendency to work more than 50 hours a week on making the world a better place is part of a mental disorder may or may not be true, but still comes off to anybody with a modicum of social experience as bragging, and bragging in poor taste, to boot.)

Especially in the context of an article which is nominally about celebrating people, as it comes off, more than a little, as an expectation that people should celebrate you.

In this specific case, I used my story to make an example, I wouldn't be comfortable with sharing someone else's story about mental health and burnout, especially since this was already a story I shared publicly. But I hear you about the broad principle, and will try to do less of that - thanks!

I wouldn't judge anyone for donating more or less per se. It's just weird to hear people describe themselves as "effective altruists" if their current level is "actually, as a student I don't have any income, so I never really donated anything, but a few years later I am totally going to donate". It makes you wonder how large, exactly, the set of effective altruists who have already donated at least one cent is. Also, it cheapens the meaning of the words.

Perhaps mathematically speaking, the difference between donating 0 and donating 1 is much smaller than between donating 1 and donating 1000. But psychologically it is probably the other way round. The person who has already donated $1 to a GiveWell charity has already overcome the trivial inconveniences; all that is necessary is to repeat the same steps again with a different number. But the difference between 0 and 1 is the difference between "all talk, no action" and making the first step.

Hardcore EAs -- awesome; softcore EAs -- still very good; zerocore EAs -- please stop using the label.

I wonder what the real distribution is among people who publicly identify as EAs.

Maybe there could be some verification system, like a website that would publicly certify that you have donated at least $1 to an effective charity. (Or maybe multiple tiers, but this is already more or less what James_Miller suggested. Just saying that the minimal amount could be small, but definitely nonzero.)

I don't think the definition of effective altruism should be about donating money. There are many ways to be altruistic, especially ways with more impact than donating one symbolic dollar.

I haven't met any zerocore EAs, but I trust your experience they exist. I tend to use the term "resources" instead of money, as some people have time/talent to give. If people have not contributed resources to EA causes, I agree they should not call themselves EAs.

I haven't met them either, but I remember reading about them in some articles people shared on facebook. The articles didn't make any judgement about this subset, they merely mentioned that some of the EAs don't donate anything, because they were students.

And my reaction was: this is so bad for PR. I mean, the whole message of effective altruism is kinda "instead of donating to cute puppies, we use the same money to heal children with malaria". And the obvious reply in such case would be: "well, at least I donated to the cute puppies, while you only participate at the conferences talking about healing children with malaria". A less charitable reply would point out that participating at the EA conferences also costs money.

But maybe in real life the subset is negligible. Internet often exaggerates things.

gjm:

There's some discussion, including numbers and graphs, here. Fraction of self-reported EAs self-reporting as donating zero (this was in an LW survey) varies from ~13% to ~43% depending on age. (Younger people are more likely to report donating nothing, especially the under-20 category which is presumably full of impoverished students.)

Would be curious to see the difference between donations and volunteering - statistics show that young people tend to volunteer more. Do you know of any information on EA volunteering?

gjm:

No. Sorry.

And my reaction was: this is so bad for PR. I mean, the whole message of effective altruism is kinda "instead of donating to cute puppies, we use the same money to heal children with malaria".

I don't think the whole message about effective altruism is about how to donate money. 80,000 hours for example recently wrote Why you should focus more on talent gaps, not funding gaps.

Exactly.

I know several students who are working hard to gain the skills necessary to make big impacts, especially on XRisk reduction. They identify as EAs, and I think it would be the wrong move to tell them they're not "real EAs" because they aren't donating money to EA charities.

On one hand, I agree at least somewhat about the importance of preventing free riders. On the other hand, claiming that someone isn't a "real" effective altruist makes them believe they're less of an effective altruist, which makes them less committed to the cause. Conversely, every time a non-donating EA proclaims their EAness, it becomes a more integral part of their identity, raising their level of commitment to donating when they get income.

Wouldn't donating a symbolic dollar create even stronger psychological effect?

EA as a movement is about the idea that charity is not about engaging in symbolic actions but about actually having an effect.

Then people who don't donate at all shouldn't describe themselves as effective altruists.

They are aspiring effective altruists; they plan to donate in the future, but they may also change their minds later. Talk is cheap.

Only if they actually do it. It seems to follow that anyone willing to donate a symbolic dollar is already fairly likely to stay the course and therefore a low-priority target, whereas the people who wouldn't donate the symbolic dollar are also the easiest to alienate.

Everyone has done something for others in their lives, including students who haven't donated money to charities. And even most of those people have probably given money to other people at one point or another: a beggar, a friend, a child. So I don't think it's reasonable to talk about "zerocore" people.

EA could have different well-defined membership levels, such as gold, silver, and bronze, each corresponding to a different amount of money or labor regularly donated.

It feels like it would be counter-productive to assign people levels. There is a sort-of two-tier membership level already: The Life You Can Save Pledge (>1%) and the Giving What We Can Pledge (>10%). I encourage cheering people on (and engaging with them in a productive conversation) no matter where they currently are - even if they are just giving $20 to an obviously non-cost-effective charity.

What purpose do you think this arrangement might serve?

Firms that sell low marginal cost products face the problem of how to get the most revenue from customers when some customers are willing to pay a lot more than others. One solution is to just sell to customers willing to pay a lot, which seems analogous to EA not recognizing soft-core donors. Another solution is to sell different quality products to customers in the hopes that you get soft-core customers to pay a bit, but still can get lots of revenue from customers that place a greater monetary value on your product.

Interesting. So the implication would be to draw a clear difference between softcore and hardcore EAs, and give appropriate plaudits to each?

The purpose of market segmentation is to maximize revenue :-/

It seems like a valid application to me if one takes people to be paying some kind of cost in terms of their time and money in order to receive recognition for being a good person. You would like people to be able to spend a moderate amount of time and money to receive a moderate amount of recognition, and a large amount of time and money to receive a large amount of recognition. Having terms for different levels could help with this.

to receive recognition for being a good person

Let me rephrase it as 'I can buy the status of "a good person"'. Still fine?

http://lesswrong.com/lw/e95/the_noncentral_fallacy_the_worst_argument_in_the/

You're arguing by analogy: http://lesswrong.com/lw/vx/failure_by_analogy/ and trying to do guilt by association. It's an appeal to emotion, not reason.

Literally every other website on the internet will allow bad arguments like this... are you sure you don't want to hang out somewhere else? Seriously, give it some thought.

LOL. You have to try harder :-P

[anonymous]:

Seems like the same idea as your "becoming a superdonor" idea - you want to attach status to doing a lot of good.

Becoming a superdonor would be the same as becoming an EA. I'm not convinced that we should have levels of EA-ship or Superdonor-ship. I'm open to being convinced, of course, I just don't intuitively see the value of it.

...charisma is not a bloody dump stat, and you can't just substitute "Intelligence" in for it, no matter what feats you take.

Bloody hell. Great, somebody came up with an idea on how to make EA work. Only, it's an absolutely terrible idea, but since nobody has come up with anything else, you people are going for it.

This change doesn't pattern-match with status. It pattern-matches with cult, for the exact same reason Gleb's behavior pattern matches with sociopath. And the reason is that you are blindly tampering with the exact same mental levers cults (and sociopaths) tamper with, with a likewise imperfect understanding of how they work. We don't call them cults or sociopaths when they're good at what they do, you see, because we do not notice.

But even if there weren't pattern-matching which will completely nuke any credibility EA has as a result of this dropout-of-cult-leader-school approach, what would an ordinary person do, if confronted with a symbol that indicated that somebody has more status than them?

They'll buy into the first half-credible explanation on why that status symbol doesn't mean anything, and then they'll feel higher status than all the "dumb" people buying into that status symbol, because few people can tell the difference between cynicism and intelligence (disturbingly few people even realize the two concepts are distinct, actually). Think about all the people -already doing that- because some villagers apparently fish with their mosquito nets, thus disproving EA can be useful - clearly the argument is fallacious, but that doesn't matter, because they're rationalizing what they already want to believe, because EA is already threatening their sense of relative status.

Remember the HPMoR bit about Phoenixes as status symbols? Same principle.

Gleb Tsipursky's point is that this kind of stratification is what happens anyway. You want softcore EAs to get more recognition than that, because their contribution to EA as a social group is mostly hidden and not linked to visible effort within the EA movement.

oge:

I think that using the term "effective altruist" causes a lot of problems with labeling (e.g. 'hardcore EA', 'softcore EA'). My thinking clarified when I began using only the term "effective altruism", and using it to stimulate asking, "how can I do the most good for each dollar of mine?"

http://effective-altruism.com/ea/9s/effective_altruism_is_a_question_not_an_ideology/

I hear you, and saw that article. It's a good piece, but we do have to deal with reality.

[anonymous]:

Thanks for this, Gleb. A very well written, coherent, and solid argument. I'm interested in getting a discussion going about the implications of a more inclusive EA community. My intuition is that incentivising light EAs may be less impactful over the long term due to "evaporative cooling of group beliefs".

[This comment is no longer endorsed by its author]

Can you give a link to posts showing elitism in EA that weren't written in response to this one?