To clarify, I also don't think EA has much potential as a social movement even if marketed properly. Specific EA beliefs are much more spreadable memes down the line IMO.

Yup - although in the case of EA, that's still likely to be a very slow process. This isn't the sort of thing that can go viral. It takes months or years of cultivation before someone goes from complete outsider to core member.

If you're talking about recruiting new EAs, it sounds like you mean people who agree with enough of the entire meme set that they identify as EAs. Have there been any polls on what percentage of self-identifying EAs hold which beliefs? It seems like the type of low-hanging fruit .impact could pick. That poll would give you an idea of how common it is for EAs to believe only small portions of the meme set. I expect that people agree with the majority of the meme set before identifying as EA. I believe far more of it than most people do, and I still only borderline identify as EA.

So you expect movement building / outreach to be a lot less successful than community building ("inreach", if you will)?

Yes, especially if the same strategies are expected to accomplish both. They're two very different tasks.

Some of this comes down to what counts as an "EA". What kind of conversion do we need to do, and how much? I also think I'll be pretty unsuccessful at getting new core EAs, but what can I get? How hard is it? These are things I'd like to know, and things I believe would be valuable to know.

I think you can convince people to give more of their money away, you can convince people to take the effectiveness of a charity into account, you can convince people to care more about animals or to stop eating meat, and possibly you can convince people that there are technological risks greater than climate change and nuclear war. I don't think you'll convince the same person of all of these things. Rather, they'll be individuals who are on board with specific parts and who may or may not identify as EA.

I'm saying it helps with retention but barely at all with recruitment - and that it may even get in the way of recruiting casual EAs. I don't think Skillshare favours will make people want to self-identify as EA. Only a minority of people even need the sorts of favours being offered.

"A stronger community for the effective altruist movement should better encourage existing EAs to contribute more and better attract new people to consider becoming EA. By building the EA Community, we hope to indirectly improve recruitment and retention in the effective altruist movement, which in turn indirectly results in more total altruistic effort, in turn resulting in more reduced suffering and increased happiness."

I'm going to predict that .impact struggles to meet this objective.

I think you're taking a naive view of how movement building works.

I think you need to see the distinction between retaining and recruiting members as analogous to the tension between a core and a casual fan base. In order to recruit new EAs, your pitch will almost certainly have to downplay certain areas that many core EAs spend a lot of time thinking about. That way, you'll bring in a lot of new people who, for example, buy the argument that you should donate to the charity that provides the most bang for your buck, yet still have zero interest in AI or animals. If you refuse to sideline core EA values in order to attract more casual EAs (e.g. people who donate to GiveWell's top charities and give a bit more than average), then, well, that's admirable, I guess, but your movement building won't go anywhere. There's a reason for-profit organizations do this - it actually works.

The number of people who share most EA values is going to remain low for a very long time. Increasing that number wouldn't involve "recruitment" so much as full-on conversion. As long as your goal is to increase that number, you're going to see very low recruitment rates. Most people aren't on the market for a new worldview - though perhaps for individual new beliefs or values. And if you don't agree with a worldview, you aren't going to join the community just because it's active.

If you want more "total altruistic effort," go convince people to show more altruistic effort. Trying to build a movement around a group as complex and alienating as EA by strengthening its internal ties will dissuade most outsiders from wanting to join you. Pre-existing communities can be scary things to self-identify with.

You know how some parents make their kids try cigarettes at a young age so that they'll hate it and then not want to smoke when they're older? Well, a website like Brian Tomasik's is like that for most potential EAs. Way too much, too soon.

Cool. Will there be a lot of overlap with Intuition Pumps and Other Tools for Thinking? Based on your description, it sounds like Dennett just wrote this book for you.

You're right. I think scientific thinkers can sometimes misinterpret skepticism as meaning that nothing short of peer-reviewed, well-executed experiments can count as evidence. I think anecdotal evidence is sometimes worth taking seriously. It isn't the best kind of evidence, but it falls above zero on the continuum.

The good news is that our higher cognitive abilities also allow us to overcome depression in many situations. In Stumbling on Happiness, Daniel Gilbert explains how useful it is that we can rationalize away bad events in our lives (such as rejection). This capability, which Gilbert refers to as our psychological immune system, explains why people are able to bounce back from negative events much more quickly than they expect to.

I think speaking in terms of probabilities also clears up a lot of epistemological confusion. "Magical" thinkers tend to believe that a lack of absolute certainty is more or less equivalent to total uncertainty (I know I did). At the same time, they'll understand that a 50% chance is not a 99% chance, even though neither is 100% certain. It might also be helpful to point out all the things they are intuitively very certain of (that the sun will rise, that the floor will not cave in, that the carrot they put in their mouth will taste the way carrots always do) despite lacking absolute certainty. I think it's important to make clear that you agree with them that we don't have absolute certainty of anything, and instead shift the focus toward whether absolute certainty is really necessary in order to make decisions or to claim that we "know" things.
