EA Outreach is the organization behind the EA Global conference, the EffectiveAltruism.org website, and various other projects related to the development of the Effective Altruism community. This post explains a bit more about what we are working on, and why we think that donating to us might currently be the best use of marginal donations in the EA movement. It is both part of our annual fundraiser and a general attempt to communicate better what we have been working on, and what we will be working on in the coming months.

What is EAO's core vision?

Ironically, our focus is not what our name might naively suggest. Though EAO was founded with the goal of rapidly growing the EA community, we have since realized that pure growth is not the best thing to focus on. Instead, our focus is much better summarized by the statement:

Understand the EA community, and help guide it towards the worlds in which it can have the most impact

Concretely, this means that EAO is trying to do two things:

  1. Do research on the composition, structure and dynamics of the EA community
  2. Build projects that steer the EA community towards a better future (using the previously acquired knowledge)

Keep in mind that these are our highest-level goals, and that in our everyday work we tackle projects that are much more concrete than this.

So what is EAO actually doing every day?

On any given day, we are working on a selection of concrete projects. Most of our projects generate direct value while also helping us gather information about the structure and function of EA at large. Since I don't want to throw a giant wall of text at you, here is a diagram that summarizes the different projects we worked on in 2015. Feel free to ask more detailed questions in the comments, or read our full plan for 2016.

[Diagram: EA Outreach Projects 2015]

The important thing to keep in mind in all of this is that EAO has existed for less than a year, and consisted of fewer than three people for most of that year. The above is quite impressive for a group that small, and I think we have basically kept up that level of productivity per person as we expanded our team (to a total of six people).

That said, though I think the above projects are somewhat indicative of what EAO is doing on any given day, I think they are also slightly misleading. The projects we tackled during 2015 were immediately valuable, but they weren't driven by much of a larger vision, or by a detailed model of where we want EA to go. This has changed in the last few months, and during 2016 we are going to run projects that are both significantly larger in scope and backed by significantly more sophisticated models. Those projects, and why we think they are valuable, are what I am going to spend the rest of this article on (and something I am much more excited to talk about than the things we have already done).

What makes EAO a valuable target for donations?

Jonathon Smith, a donor in our most recent fundraiser, summarized his perspective on EAO as follows:

"A quick note on what encouraged me to donate to EAO.

I navigate robotic spacecraft to destinations in deep space at JPL. If you're trying to get somewhere like Jupiter or Saturn, the most important course corrections you can make are right after launch. We always have a crack team of analysts closely monitoring a spacecraft just after it leaves Earth, because the energy required to change the spacecraft's heading grows exponentially with time; point in the wrong direction too long and the mission is lost.

EA is moving really, really fast, and small adjustments to its development now are likely to have huge consequences down the road. With EAO, we have a team of talented people focused on nothing but making sure it's heading in the right direction. They are doing a lot of really impressive, concrete work (like book promotion, EAG, VIP outreach etc), but I think the greatest value in keeping them well funded is to have a vigilant eye watching for obstacles and helping navigate them at this very important, early stage of the movement."

(Thanks for the kind words Jonathon!)

I think Jonathon basically gets to the heart of it here, and I want to expand a bit on what he said in the above comment.

Jonathon says that it is really important to have a group of people watching where EA at large is headed, and making appropriate adjustments to its course. This seems reasonable. It seems unlikely that EA could maximize its impact without reflecting on its overall path, since coordination problems are common. But it clearly isn't the case that nobody in EA is reflecting on where EA is going. Quite the opposite! If the average discussion at EA Global is any indication, thinking about the overall composition and trajectory of EA is one of the most common topics of conversation in the EA community!

So the question arises: if we already have so many people thinking about where EA is headed, why add additional cooks to the kitchen? And why found a whole organization dedicated to understanding and supporting EA's big-picture trajectory?

I think there are two main reasons why a dedicated community organization like EAO should exist:

1. Coordination is difficult, and requires infrastructure and time

Right now, the different organizations in EA are doing a pretty good job of coordinating. As has repeatedly been mentioned at EA Global, almost all of the major organizations associated with EA support each other. They encourage potential new hires to first check whether another EA organization might have a bigger need for their specific talents. They coordinate on fundraisers to avoid unhealthy competition, and they generally do a good job of exchanging new information and important considerations.

But all of this comes at a cost. EA organizations are growing rapidly, and it is becoming less and less feasible for most employees of EA organizations to talk to each other individually. Judging from the Google Alerts I have set up for Effective Altruism, over the past two months EA has been averaging something like two news headlines per day. Reading these, processing them, and chasing down the implications of each of them takes the EAO team a lot of time. Other organizations cannot spare those resources; things like this distract them from solving the concrete problems they actually want to be working on.
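
(As a concrete illustration of how mechanical parts of this monitoring are: Google Alerts can deliver its results as an RSS feed, and a short script can pull and pre-filter the items. Here is a minimal sketch in Python, assuming the feedparser library; the feed URL is a placeholder for whatever Google generates for your alert, and the keyword list is purely illustrative.)

```python
# Minimal sketch: pull a Google Alerts RSS feed and keep only the items
# that mention a tracked keyword. The feed URL is a placeholder -- Google
# generates a unique one when an alert is set to "Deliver to RSS feed".
import feedparser  # pip install feedparser

ALERT_FEED_URL = "https://www.google.com/alerts/feeds/<id>/<alert-id>"  # placeholder
KEYWORDS = ["effective altruism", "GiveWell", "EA Global"]  # illustrative only


def fetch_headlines(url):
    """Return (title, link) pairs for every entry in the alert feed."""
    feed = feedparser.parse(url)
    return [(entry.title, entry.link) for entry in feed.entries]


def flag_relevant(entries, keywords):
    """Keep entries whose title mentions any tracked keyword."""
    return [(title, link) for title, link in entries
            if any(k.lower() in title.lower() for k in keywords)]


if __name__ == "__main__":
    for title, link in flag_relevant(fetch_headlines(ALERT_FEED_URL), KEYWORDS):
        print(f"- {title}\n  {link}")
```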

A dedicated community organization can solve this problem. By creating infrastructure, summarizing and consolidating information, and facilitating communication between different organizations, such an organization can significantly reduce the cognitive overhead for all other EA organizations. It can produce periodic updates on the current state of the EA community, screen the onslaught of information for the most important bits, and keep a constant eye on whether two organizations are significantly duplicating efforts.

And facilitating that kind of coordination takes time. Right now, keeping up with the new developments around EA is almost a half-time job. In the near future it will be a full-time job, and soon after that it will take a multi-person team to keep up with the onslaught of information. An organization like EAO can make sure that this effort only needs to be exerted once.

It is important to note that this effort does not have to be exerted solely by a group of individuals on the EAO team. Healthy communities develop a collective intelligence of their own, and systems like the EA Forum, LessWrong or the Facebook upvote function serve as similar information filters that allow the community at large to stay up to speed without everyone reading through all the information. But for this kind of collective intelligence to exist, we need infrastructure. We need to make sure that platforms like the EA Forum are well maintained and are used in a way that allows the community at large to understand what is happening in EA. Again, a good community organization will notice when certain infrastructure is missing, and will have the resources and expertise to build whatever is lacking.

2. Thinking rationally about your own tribe is really, really hard

One important fact to acknowledge is that being part of EA encourages the same kinds of irrational thought patterns that sports teams, political parties and other forms of community tend to encourage. EA is a tribe, and thinking about your own tribe is hard. Humans evolved as social creatures, and we are extremely good at advocating for "fair" rules and guidelines that "accidentally" end up serving our own interests. There is a whole literature on self-serving bias, and in particular on how it extends to our opinions on social rules and guidelines.

This is a problem. It means that most of the time when I come up with an idea for what the EA community at large should do, and what kinds of rules and virtues we should endorse, it will internally feel like I am proposing fair rules that everyone would obviously agree with, while unconsciously I am nudging the social context in a way that favors me. Noticing this kind of bias is extremely difficult (though some debiasing techniques appear to work at least a bit).

As with most unconscious biases, the more hasty we are in our decisions, and the quicker we have to decide, the more strongly we are affected by them. If we don't reflect on the reasons behind our sense of fairness, it is very likely that self-serving motivations will be one of its biggest driving forces. Thinking rigorously, using externally verified frameworks, and consulting many independent opinions from all over the community and outside of it all help mitigate this bias.

But again, most people and organizations do not have the time to build these kinds of frameworks, or to work through their implicit biases about what the EA community should do. And certainly none of them have the time to run frequent surveys that compile information from inside and outside the community to arrive at a balanced viewpoint. Building these frameworks and this expertise takes time. And in this domain our gut judgement will be wrong more often than not, making it key that someone puts in the relevant work.

A community organization can again solve this problem. Such an organization can take the time to build formal frameworks of how EA works. It can put significant resources into getting a balanced viewpoint by talking to all the different parts of the community, and it can reach out into the world at large for input from experts in community organizing, sociological modeling, cognitive science and many other fields. It can focus on proposing good changes to the EA community, since it doesn't have to split its attention with other problems.

(That said, this is a really hard problem, and I don't know whether it's possible at all to avoid getting sucked into rationalization, political thinking and ingroup-outgroup politics. This might just be an intractable problem, though it does seem likely to me that any organization that is not consciously aware of these problems is going to fall prey to them.)

Summary

Both coordinating large groups of people and setting up an environment in which to think rationally about your own tribe require a significant investment of time and resources that other EA organizations should not be distracted by. A dedicated community organization can shoulder that burden, and make sure that we create infrastructure in EA that supports the intellectual development of the community, while taking precautions not to fall prey to self-serving biases when proposing changes.

Can EAO be that organization?

The key question left is: "Does EAO have the talent and capacity to be the organization I outlined above?"

I think the answer is yes. The EAO team has shown that it can execute on the relevant tasks, and its composition features a rare combination of skills that makes the current team particularly well suited to the role I outlined above.

Here is a list of what I think are the most important facts about EAO when assessing whether it is suited for the role it is trying to play:

  • We are a part of the Centre for Effective Altruism, which gives us direct access to many EA organizations
  • Our team has extensive experience in organizing events for the EA community:
    • Tyler Alterman has organized dozens of VIP dinners, talks and other events in the EA community
    • I (Oliver Habryka) helped organize both the EA Summit in 2014 and EA Global in 2015, and have been organizing events for the rationality community for over two years, such as the HPMOR wrap party and the Bay Area Solstice in 2014.
    • Julia Wise has organized meetups for the EA community for over 4 years
    • Peter Buckley has co-founded multiple EA chapters at the University of Pennsylvania and has extensive experience in coordinating student chapters
  • The EAO team is very well connected to many different branches of EA. Part of our team shares an office with the Center for Applied Rationality and the Machine Intelligence Research Institute, while being part of CEA directly connects us to everything happening in Britain. Through working closely with the EAs in Australia during EA Global, we are also closely connected to the EA community there.
  • We are very well connected not only to the community of active members, but also to the larger network of donors interested in effective interventions. Through our work on EA Global, EA Ventures and our general VIP outreach, we have gained deeper insights into what the larger philanthropic community is interested in, what kinds of opportunities entrepreneurs are interested in, and what projects are possible to run within the framework of Effective Altruism.
  • The EAO team includes people with both web-development and web-design backgrounds, allowing it to create websites and web applications from scratch without relying on outside contractors. This significantly speeds up our ability to create infrastructure for the EA community.

I don't think there is currently any other group of people as well suited to this job as the current EAO team.

How are you going to do it?

Since this article is already quite long, and its purpose is more to explain the bigger picture around EA Outreach, I will summarize the concrete projects we have planned for 2016 only briefly. Here is another diagram summarizing those projects. If you are interested in more detail, please feel free to read our full plan for 2016.

[Diagram: EA Outreach Plan 2016]

How do I help?

EAO is currently running its first annual fundraiser, and we are still facing a significant funding gap. Donating money is extremely helpful. We are operating with less than a 12-month runway, and though the current members of EAO are quite comfortable with risk and instability, we would still like to be able to sustain our current level of operations and to expand, by hiring additional community organizers and building better EA infrastructure. If you think that what we are doing is valuable, please consider donating to us here.

It's also important to note that our current lack of runway creates a lot of strategic uncertainty, which might cause us to make worse decisions than we otherwise would. Reliable funding and a decent runway allow us to build much more reliable infrastructure, since we can guarantee stability for our systems and new hires.

What is EAO going to do with my money?

If you are interested in a more detailed overview of EAO's expenses, you can read our full plan for 2016 here. The plan lays out a variety of funding levels, though right now we are still trying to cover our basic expenses. For those of you who don't want to read through another giant wall of text, here is a quick summary:

EAO's money will be spent on the following things, roughly in this order:

  • Salaries of the current core team
  • Contractors and hires for EA Global, EAGx and independent chapter building
  • Equipment and tech for our web infrastructure and design work
  • Scholarships for the most promising attendees to EA Global

Ok, but what about ...?

If you have any additional questions, please feel free to ask in the comments, send me an email at oliver@eaglobal.org or schedule a Skype chat with our CEO Kerry Vaughan here.

Comments (17)

> They coordinate on fundraisers to avoid unhealthy competition

Perhaps this is just the LW-sphere, but it seems to me that every org I support (and several I don't) is running a fundraiser at the same time. What does healthy vs. unhealthy competition look like? (Perhaps everyone always does end-of-year fundraising for tax reasons.)

If fundraising were well coordinated, there would still be a large number of orgs raising in December. Something like a fifth of all giving in the US occurs during December, and in EA, the earning-to-give folks in finance find out about their bonuses in December.

However, I didn't know how many orgs were planning to raise money this December. If this had been more widely known, it might have made sense for a few of the CEA orgs to skip fundraising in December and raise a few months into 2016. So the amount of coordination now is far from optimal.

[anonymous]

FWIW, I am "meh" on EA right now, and I suspect other LW'ers are on the fence as well. After spending some time on the Effective Altruism Forum, I've noticed some worrying trends in the EA movement:

  • Drifting from rationality (this post)
  • Closed-minded (reaction to this post)
  • Overly-optimistic (this post)
  • Self-congratulatory (this post)

I am especially disappointed that EA seems to be drifting away from its rationalist roots so early in its development.

Maybe I am too demanding; any group will occasionally show flaws, and the Effective Altruism Forum may not be representative of the entire EA movement. Nevertheless, I am tipping toward pessimism.

I will continue to search for and donate to effective charities, but given my concerns I am wary of promoting myself as part of the current EA movement, or of donating to organizations like EAO. I think other LW'ers have similar reservations.

[gjm]

I'm puzzled by most of your links.

  • "Drifting from rationality": What's your problem with the post you link to? It seems to me it's simply pointing out that not everyone is a utilitarian, and that whether someone is a utilitarian is a matter of values as well as rationality. What's wrong with that?
  • "Closed-minded": the reaction to that post looks pretty positive to me. (And the post is pretty strange. It proposes creating rat farms filled with very happy rats as a utility-generating tool, and researching insecticides that kill insects in nicer ways.)
  • "Overly-optimistic": that post predicts a 5% chance that within 20 years the whole EA movement might be as big as the Gates Foundation. Do you really find that unreasonable?

I do agree about the fourth link -- but I don't think it's representative, and if you look at reactions on LW to the same author's posts here, you'll see that you're far from the only person who dislikes his style.

[anonymous]
  • Drifting from rationality

From the post, "This means that we need to start by spreading our values, before talking about implementation." Splitting the difficult "effective" part from the easy "altruism" part this early in the movement is troubling. The path of least resistance is tempting.

  • Closed-minded

Karma for the post is relatively low, and a lot of comments, including the top-rated, can be summarized as "Fun idea, but too crazy to even consider."

  • Overly-optimistic

The post glosses over the time value of money/charitable donations and the GWWC member quit rate, so I think it's reasonable to say that the Gates Foundation will almost certainly have moved more time-value-adjusted money than GWWC's members over the next twenty years. Therefore, speculating that GWWC could be a "big deal" comparable to the Gates Foundation in this time frame is overly optimistic. Still disagree? Let's settle on a discount rate and structure a hypothetical bet; I'll give you better than 20-1 odds. (See the toy calculation after this list.)

  • Self-congratulatory

I don't actually believe this is a big problem in itself, but if the other problems exist it seems like this would exacerbate them.
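
To make the time-value point concrete, here is a toy present-value calculation of the sort I have in mind. Every number in it is invented purely for illustration; an actual bet would need real figures and an agreed discount rate.

```python
# Toy sketch of the time-value argument: a dollar moved in year t is only
# worth 1/(1+r)^t of a dollar moved today. All figures below are made up.


def present_value(annual_flows, discount_rate):
    """Discount a stream of yearly money moved back to today's dollars."""
    return sum(flow / (1 + discount_rate) ** t
               for t, flow in enumerate(annual_flows, start=1))


gates_foundation = [4e9] * 20       # hypothetical: ~$4B/year for 20 years
gwwc_members = [1e7 * 1.3 ** t      # hypothetical: $10M/year, growing 30%/year
                for t in range(20)]

r = 0.05                            # an example discount rate
print(f"Gates PV: ${present_value(gates_foundation, r):,.0f}")
print(f"GWWC PV:  ${present_value(gwwc_members, r):,.0f}")
```

Even with generous growth assumptions for GWWC, the discounted totals stay far apart, which is why I'm comfortable offering those odds.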

> Karma for the post is relatively low, and a lot of comments, including the top-rated, can be summarized as "Fun idea, but too crazy to even consider."

To be clear, the ideas in question are to establish charities to:

  1. breed rats and then pamper those rats so as to increase the amount of happiness in the world
  2. research insecticides that kill insects in nicer ways

I think that there are legitimate, rational reasons to reject these ideas. I think that you are being uncharitable by assuming that those who responded negatively to those ideas are closed-minded; not every idea is worth spending much time considering.

Those ideas are perfectly rational, given EA premises about maximizing all utility (and the belief that animals have utility). It's just that they're weird conclusions because they are based on weird premises.

Most people, when they encounter such weird conclusions, would begin to question the premises, not let themselves get led to their doom. It's possible to bite the bullet too much.

The problem is that "utility" is supposed to stand for what I care about. I don't care about happy rats or happy insects. That is why I am against that kind of project. That is also why eating meat does not bother me, even though I am pretty sure that pigs and cows can and do suffer. I might prefer that they not suffer, other things being equal, but my concern about that is tiny compared to how much I care about humans.

> I don't care about happy rats or happy insects.

You might not care about happy rats, but a sizable number of EAs care about animal suffering.

If utility stands for what you care about, everyone is a utilitarian by definition. Even if you only care about yourself, that just means that your utility function gives great weight to your preferences and no weight to anyone else's.
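
(Spelled out, the formalization I have in mind is something like this: on that reading, each person $j$ acts as if maximizing a weighted sum of everyone's welfare,

$$U_j(x) = \sum_i w_{ji}\, u_i(x),$$

and the pure egoist is just the special case $w_{jj} = 1$ and $w_{ji} = 0$ for all $i \neq j$.)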

[gjm]

"Utilitarian" doesn't mean "acting according to a utility function". Further, many people's actions are really difficult to express in terms of a utility function, and in order even to try you need to do things like making it change a lot over time and depend heavily on the actions and/or character of the person who's utility function it's supposed to be.

I'm not (I think) saying that to disagree with you; if I'm understanding correctly your first sentence is intended as a sort of reductio ad absurdum of entirelyuseless's comment. But, if so, I am saying the following to disagree with you: I think it is perfectly possible to be basically utilitarian and think animal suffering matters, without finding it likely that happy rat farms and humane insecticides are an effective way to maximize utility. And so far as I know, values of the sort you need to hold that position are quite common among basically-utilitarian people and quite common among people who identify as EAs.

> Most people, when they encounter such weird conclusions, would begin to question the premises, not let themselves get led to their doom. It's possible to bite the bullet too much.

Great point. As the old saying goes:

> one person's modus ponens is another person's modus tollens

ETA:

However, none of this is an indictment of EA - one can believe in the principles of EA without also being a strict hedonistic utilitarian. The weird conclusions follow from utilitarianism rather than from EA.

> Closed-minded
>
> Karma for the post is relatively low, and a lot of comments, including the top-rated, can be summarized as "Fun idea, but too crazy to even consider."

If a net positive reception is the best example you can bring of EA being closed-minded, it seems to me that anybody who hasn't looked into the issue of whether EA is open-minded should update in the direction of EA being more open-minded than their priors suggest.

[gjm]

> spreading our values

The post argues that the most effective way to achieve EA goals is to prioritize spreading EA-ish values over making arguments that will appeal only to people whose values are already EA-ish. I don't know whether that's correct, but I fail to see how figuring out what's most effective and doing it could be an abandonment of rationality in any sense that's relevant here. Taking the path of least resistance -- i.e., seeking maximum good done per unit cost -- is pretty much the core of what EA is about, no?

> Karma for the post is relatively low

OK. Inevitably some posts will have relatively low karma. On what grounds do you think this shouldn't have been one of them?

> moved more time value-adjusted money [...] over the next twenty years

I don't think that's at all what the post was assigning a 5% probability to.

I agree with the concern about the epistemics of the EA community. I touched on these in a talk I gave at EA Global.

However, I'm not sure linking to isolated posts that are concerning is a good way to get a sense of the degree to which this is a problem in the EA community. You'll want to weight the posts by the actual influence that the poster has over the movement. Of those posters, Rob Wiblin is the most influential (he works at CEA). The rest are neither employed at EA orgs nor large donors.

A community that is both growing and is epistemically strong will probably still have a ton of low-quality posts. This seems normal to me unless we see wider adoption of low-quality ideas. I don't think this is the case so far.

> Self-congratulatory

Given that our kind can't cooperate, why is being self-congratulatory bad?

Could you clarify why you think that the EA movement is becoming closed-minded? If I understand correctly, it is because there was some negative reaction to the proposal, expressed here, to establish a charity that breeds rats and then pampers them so as to increase the amount of happiness in the world. Is that a correct understanding of your concern?