EA grants available (to individuals)

by Jameson Quinn · 3 min read · 7th Feb 2019 · 8 comments



I'm considering applying for some kind of grant from the effective altruism community. A quick sketch of the specifics is here. Raemon replied there with a list of possibilities. In this post, I'll look into each of those possibilities, to make this process easier for whoever comes next. In the order Raemon gave them, those are:


"Can I apply for a grant?
In general, we expect to identify most giving opportunities via proactive searching and networking. We expect to fund very few proposals that come to us via unsolicited contact. As such, we have no formal process for accepting such proposals and may not respond to inquiries. If you would like to suggest that we consider a grant — whether for your project or someone else’s — please contact us."

This looks like a case where it's at least partially about "who you know". I do in fact have some contacts I could approach in this regard, and I may do so as this search proceeds.

But this does seem like a bias that it would be good to try to reduce. I understand that there are serious failure modes for "too open" as well as "too closed", but based on the above I think it currently tilts towards the latter. Perhaps a publicly-announced process for community vetting would help? I suspect there are people who are qualified and willing to help sort the slush pile such a process would create.

CEA (Center for Effective Altruism)

Applications for the current round have now closed. If you’d like to be notified when applications next open, please submit your contact information through this form.
The goal of Effective Altruism Grants is to enable people to work on projects that will contribute to solving some of the world’s most important problems. We are excited to fund projects that directly contribute to helping others, as well as projects that will enable individuals to gain the skills needed to do so.
CEA only funds projects that further its charitable objects.[1] However, we welcome applications that may be of interest to our partners who are also looking to support promising projects. Where appropriate, we have sometimes passed applications along to those partners.

This would seem to be a dead end for my purposes in two regards. First, applications are not currently open, and it's not clear when they will be. And second, this appears to focus on projects with immediate benefits, and not meta-level basic research like what I propose.

BERI (Berkeley Existential Risks Initiative) individual grants

BERI’s Individual Grants program focuses on making grants to individuals or teams of individuals, rather than to organizations. There are several types of individual grants programs that BERI expects to run, such as:
Individual Project Grants are awarded to individuals to carry out projects directly in service of BERI’s mission.
Individual Level-Up Grants are awarded to individuals to carry out projects or investigations to improve the skills and knowledge of the grantee, with hopes that they will carry out valuable work for BERI’s mission in the future.
What is the process for obtaining an individual grant from BERI?
Typically, BERI will host “rounds” for its various individual grants programs. Details about how to apply will be in the announcement of the round.... If you would like to be notified when BERI is running one of the above grants rounds, please send an email to individual-grants@existence.org noting which type of grant round you are interested in.

Another dead end, at the moment, as applications are not open.

EA Funds

There are four funds (Global Development, Animal Welfare, Long-Term Future, and Effective Altruism Meta). Of these four, only the Long-Term Future Fund appears to have a process for individual grant applications, linked from its main page. (Luckily for me, that's the best fit for my plan anyway.)

We are particularly interested in small teams and individuals that are trying to get projects off the ground, or that need less money than existing grant-making institutions are likely to give out (i.e. less than ~$100k, but more than $10k). Here are a few examples of project types that we're open to funding an individual or group for (note that this list is not exhaustive):
+ To spend a few months (perhaps during the summer) to research an open problem in AI alignment or AI strategy and produce a few blog posts or videos on their ideas
+ To spend a few months building a web app with the potential to solve an operations bottleneck at x-risk organisations
+ To spend a few months up-skilling in a field to prepare for future work (e.g. microeconomics, functional programming, etc).
+ To spend a year testing an idea that has the potential to be built into an org.

This is definitely the most promising option for my purposes. I will be applying to this fund in the near future.


I'm looking for funds in the $10K–$100K range for a short-term project that would probably fall through the gaps of traditional funding mechanisms — an individual basic research project. The EA community seems to be trying to fund this kind of project in a way that has fewer arbitrary gaps while still maintaining rigorous standards. Nevertheless, I think the landscape I surveyed above is still fragmented in arbitrary ways, and worthy projects are still probably falling through the gaps.

Raemon suggested in a comment on my earlier post that "something I'm hoping can happen sometime soon is for those grantmaking bodies to build more common infrastructure so applying for multiple grants isn't so much duplicated effort and the process is easier to navigate, but I think that'll be awhile". I think that such "common infrastructure" would enable a more unified triage process, so that the best proposals wouldn't fall through the cracks. I think this benefit would be even greater than the ones Raemon mentioned (less duplicated effort and easier navigation). I understand that this refactoring takes time and work, and probably won't be ready in time for my own proposal.

PS: see also this website on AI alignment funding options, which came up in comments.