Many of you know about Effective Altruism and the associated community. It has a very significant overlap with LessWrong, and has been significantly influenced by the culture and ambitions of the community here.
One of the most important things happening in EA over the next few months is EA Global, the biggest EA and Rationality community event to date, happening throughout the month of August in three different locations: Oxford, Melbourne and San Francisco (which is unfortunately already full, despite us choosing the largest venue that Google had to offer).
The purpose of this post is to make a case for why attending the event is a good idea, and to serve as a hub for information that might be particularly relevant to the LessWrong community (as well as an additional place to ask questions). I am one of the main organizers and very happy to answer any questions you have.
Is it a good idea to attend EA Global?
This is a difficult question that obviously will not have a single answer, but from the best I can tell, for the majority of people reading this post the answer seems to be "yes". The EA community has been quite successful at shaping the world for the better, and at building an epistemic community that seems effective at changing its mind and updating on evidence.
But other people have already argued in favor of supporting the EA movement, and I don't want to repeat everything they said. Instead I want to focus on a more specific question: "Given that I believe EA is overall a promising movement, should I attend EA Global if I want to improve the world (according to my preferences)?"
The key question here is: Does attending the conference help the EA Movement succeed?
How attending EA Global helps the EA Movement succeed
It seems that the success of an organization is highly dependent on the interconnectedness of its members. In general, a rule seems to hold: the better connected the social graph of your organization is, the more effectively it works.
In particular, any significant divide in an organization, any clustering of groups that do not communicate much with each other, seems to significantly reduce the organization's output. I wish we had better studies on this, and that I could link to more sources, but everything I've found so far points in this direction. The fact that HR departments are willing to spend very large sums of money to encourage employees to interact socially with each other is further evidence that this is a good rule to follow (though far from conclusive).
What holds for most organizations should also hold for EA. If this is true, then the success of the EA movement, in both the volume and the quality of its output, depends significantly on the interconnectedness of its members.
But EA is not a corporation, and EA does not share a large office. If you were to draw the social graph of EA, it would look very clustered: the Bay Area cluster, the Oxford cluster, the Rationality cluster, the East Coast and West Coast clusters, and many small clusters all over Europe, with meetups and small social groups in different countries that have never talked to each other. EA is splintered into many groups, and if EA were a company, its HR department would be justified in spending a very significant chunk of resources on connecting those clusters as much as possible.
There are not many opportunities for us to increase the density of the EA social graph. There are other, minor conferences, and online interaction does part of the job, but the past EA summits were the main events at which people from different clusters of EA met each other for the first time. There they built lasting social connections, which actually connected these separate clusters. This had a massive positive effect on the output of EA. Some examples:
- Ben Kuhn put me in contact with Ajeya Cotra, resulting in the two of us running a whole undergraduate class on Effective Altruism, which included Giving Games for various EA charities, funded with over $10,000. (You can find documentation of the class here.)
- The last EA summit resulted in both Tyler Alterman and Kerry Vaughan being hired by CEA; both are now full-time employees who are significantly involved in helping CEA set up a branch in the US.
- Last year's summit and retreat led to significant collaboration between CFAR, Leverage, CEA and FHI, with these organizations repeatedly helping each other coordinate fundraising attempts, hiring processes, and logistical difficulties.
This is going to be even more true this year. If we want EA to succeed and continue shaping the world for the better, we want as many people as possible to come to the EA Global events, ideally from as many separate groups as possible. This means that you, especially if you feel somewhat disconnected from EA, should seriously consider coming. I estimate the benefit to be much bigger than the cost of a plane ticket and the entrance ticket (~$500). If you do find yourself significantly constrained by financial resources, consider applying for financial aid, and we will very likely be able to arrange something for you. By coming, you provide a service to the EA community at large.
How do I attend EA Global?
As I said above, we are organizing three events in three different locations: Oxford, Melbourne and San Francisco. We are particularly lacking representation from groups in mainland Europe, and it would be great if they could make it to Oxford. Oxford also has the most open spots and is going to be much bigger than the Melbourne event (300 attendees vs. 100).
If you want to apply for Oxford go to: eaglobal.org/oxford
If you want to apply for Melbourne go to: eaglobal.org/melbourne
If you require financial aid, you will be able to put in a request after we've sent you an invitation.