Open Philanthropy is seeking proposals for outreach projects

by abergal, ClaireZabel · 15 min read · 16th Jul 2021 · 2 comments


[Cross-posted from the EA Forum.]

Open Philanthropy is seeking proposals from applicants interested in growing the community of people motivated to improve the long-term future via the kinds of projects described below.[1]

Apply to start a new project here; express interest in helping with a project here.

We hope to draw highly capable people to this work by supporting ambitious, scalable outreach projects that run for many years. We think a world where effective altruism, longtermism, and related ideas are routine parts of conversation in intellectual spaces is within reach, and we’re excited to support projects that work towards that world.

In this post, we describe the kinds of projects we’re interested in funding, explain why we think they could be very impactful, and give some more detail on our application process.

Proposals we are interested in

Programs that engage with promising young people

We are seeking proposals for programs that engage with young people who seem particularly promising in terms of their ability to improve the long-term future (and may have interest in doing so).

Here, by “particularly promising”, we mean young people who seem well-suited to building aptitudes that have high potential for improving the long-term future. Examples from the linked post include aptitudes for conducting research, advancing into top institutional roles, founding or supporting organizations, communicating ideas, and building communities of people with similar interests and goals, among others. Downstream, we hope these individuals will be fits for what we believe to be priority paths for improving the long-term future, such as AI alignment research, technical and policy work reducing risks from advances in synthetic biology, career paths involving senior roles in the national security community, and roles writing and speaking about relevant ideas, among others.

We’re interested in supporting a wide range of possible programs, including summer or winter camps, scholarship or fellowship programs, seminars, conferences, workshops, and retreats. We think programs with the following characteristics are most likely to be highly impactful:

  • They engage people ages 15 - 25 who seem particularly promising in terms of their ability to improve the long-term future, for example people who are unusually gifted in STEM, economics, philosophy, writing, speaking, or debate.
  • They cover effective altruism (EA), rationality, longtermism, global catastrophic risks, or related topics.
  • They involve having interested young people interact with people currently working to improve the long-term future.

Examples of such programs that Open Philanthropy has supported include SPARC, ESPR, the SERI and FHI summer research programs, and the recent EA Debate Championship. However, we think there is room for many more such programs.

We especially encourage program ideas which:

  • Have the potential to engage a large number of people (hundreds to tens of thousands) per year, though we think starting out with smaller groups can be a good way to gain experience with this kind of work.
  • Engage with groups of people who don’t have many ways to enter relevant intellectual communities (e.g. they are not in areas with high concentrations of people motivated to improve the long-term future).
  • Include staff who have experience working with members of the groups they hope to engage with—in particular, experience talking with young people about new ideas while being respectful of their intellectual autonomy and encouraging independent intellectual development.

We encourage people to have a low bar for submitting proposals to our program, but note that we view this as a sensitive area: we think programs like these have the potential to do harm by putting young people in environments where they could have negative experiences. Nicole Ross at the Centre for Effective Altruism (email nicole@centreforeffectivealtruism.org) is available to provide advice on these kinds of risks.

Some reasons why we think this work has high expected value

A priori, we would guess that people are more likely to get interested in new ideas and opportunities when they are relatively young and have fewer preexisting commitments. This guess is consistent with the results of a survey Open Philanthropy recently ran—we surveyed approximately 200 people who our advisors suggested had the potential to do good longtermist work, most of whom had recently made career changes that we thought were positive from a longtermist perspective. As part of this survey, we asked respondents several questions regarding the age at which they first encountered effective altruism or effective altruism-adjacent ideas.

  • On average, survey respondents reported first encountering EA/EA-adjacent ideas when they were 20 years of age.
  • About 25% of respondents first encountered EA/EA-adjacent ideas at ages 18 or below, even though few EA outreach projects focus on that age range.
  • On average, respondents said the best age for them to first encounter EA/EA-adjacent ideas would have been 16.

Survey respondents often mentioned that hearing about EA before starting university would have been particularly helpful because they could have planned how to use their time at university better, e.g. what to major in.

We also asked survey respondents to brainstorm open-endedly about how to get people similar to them interested in these ideas. 10% of responses mentioned starting outreach at younger ages, particularly in high school. Several respondents mentioned that SPARC and ESPR had been helpful for them and that they would recommend these programs to similar people. (Certain other high school outreach projects have reported less success, but we don’t think these less-targeted programs provide much evidence about how promising targeted high school outreach is likely to be overall, as discussed here.)

Our survey also showed that EA groups, particularly university groups, have had a lot of impact on longtermist career trajectories. On a free-form question asking respondents to list the top few things that increased their expected impact, respondents listed EA groups more commonly than any other factor. On other measures of impact we used in our survey analysis, EA groups ranked between second and fourth among potential factors, above many EA organizations and popular pieces of writing in the EA-sphere. Most of this impact (65 - 75% on one measure) came from university groups. We think this suggests that, more generally, offering high-quality opportunities for university students to get involved is a promising kind of intervention.

Made-up examples of programs we think could be impactful

These examples are intended to be illustrative of the kinds of programs we’d be interested in funding. This is not intended to be a comprehensive list, nor a list of the programs we think would be most impactful.

We think these programs are unlikely to work fully as written. Founders generally have to dive deep into a project plan to figure out what’s tenable, altering their plan multiple times as they get a better understanding of the space, and we haven’t done that work. As such, we’d like these examples to serve as inspiration, not as instructions. We think programs of this kind are more likely to be successful when the founders develop their own vision and understanding of their target audience.

We would ultimately like to support dedicated teams or organizations that run programs for young people at scale. That said, we are likely to recommend that applicants with less of a track record start by trying out a small pilot of their program and iterating while maximizing program quality and target fit, rather than scaling immediately.

Example 1: A free two-week summer school in Oxford that teaches content related to longtermism to promising high school students. The program could have a similar structure to SPARC and ESPR, but with a more explicitly longtermist focus, and it could engage a broader range of gifted high school students.

  • We think programs like this are most effective when they focus on highly promising students, e.g. by filtering on Olympiad participation, high standardized test scores, competitive awards, or other markers of talent.
  • Oxford seems like a good location for programs like this because its status as an EA hub makes it easy for current longtermists doing good work to instruct and interact with students, which we think is important for programs like this to be successful. (Berkeley and Stanford seem like good locations for similar reasons.)
  • Oxford is a cool place to visit in and of itself, making a program located there attractive as a paid trip for high school students.

Example 2: A monthly AI safety workshop for computer science undergraduates, covering existing foundational work in AI safety.

  • There have been several programs like this, notably AIRCS, which our survey suggests has had an impact on some longtermist career trajectories. We think it’s likely that AIRCS hasn’t saturated the pool of top computer science undergraduates, and that there is room for more programs of this form that experiment with different kinds of content and instructors.

Example 3: A one-week summer program about effective altruism in Berkeley combined with a prestigious $20,000 merit-based scholarship for undergraduate students. The scholarship would involve an application process that required substantial engagement with ideas related to effective altruism, e.g. a relevant essay and an interview.

  • We think the best scholarship programs will be fairly selective, so as to attract very promising applicants and create a very strong cohort.
  • In-person programs that run right before students start their undergraduate degrees might be particularly impactful, via bolstering EA groups at top universities.
  • Scholarships and other programs that include substantial financial opportunities risk attracting applicants who are interested only in the money the program provides. We think programs like this should construct application processes that make an effort to identify applicants genuinely interested in effective altruism, e.g. via essays and interviews.

Example 4: A monthly four-day workshop teaching foundational rationality content to promising young people.

  • The workshop could teach foundational technical topics in rationality, including some covered by CFAR in the past, e.g. probability theory, Bayesianism, Fermi estimation, calibration, betting, cognitive biases, etc., as well as exercises intended to help students use these thinking tools in the real world.
  • This could overlap heavily with SPARC’s content, but could engage a larger number of people per year than SPARC has capacity for, as well as a more varied or substantively different audience.

Example 5: A fall jobs talk and follow-up discussion, held at top universities, describing career paths in defensive work against future biological catastrophes.

  • We think the fall of students’ final year is a good time to prompt them with concrete career suggestions, and with COVID-19 in recent memory, we think the next few years could be a particularly good time to talk to students about careers in global catastrophic biological risk reduction.

Projects aiming at widespread dissemination of relevant high-quality content

We are also seeking proposals for projects that aim to share high-quality, nuanced content related to improving the long-term future with large numbers of people. Projects could cover wide areas such as effective altruism, rationality, longtermism, or global catastrophic risk reduction, or they could have a more specific focus. We’re interested in supporting people both to create original content and to find new ways to share existing content.

Potential project types include:

  • Podcasts
  • YouTube channels
  • Massive open online courses (MOOCs)
  • New magazines, webzines, blogs, and media verticals
  • Books, including fiction
  • Strategic promotion of existing content (with the permission of its creators or their representatives), especially content that has historically drawn in promising individuals

Existing projects along these lines include the 80,000 Hours Podcast, Robert Miles’s AI alignment YouTube channel, and Vox’s Future Perfect.

We encourage projects that involve content in major world languages other than English, especially by native speakers of those languages—we think projects in other languages are especially likely to reach people who haven’t had as many opportunities to engage with these ideas.

We would like interested people to have a low bar for submitting a proposal, but we think projects that misrepresent relevant ideas or present them uncarefully can do harm by alienating individuals who would have been sympathetic to them otherwise. We also think it’s important to be cognizant of potential political and social risks that come with content creation and dissemination projects in different countries. Nicole Ross at the Centre for Effective Altruism (email nicole@centreforeffectivealtruism.org) is available to provide advice on these kinds of risks.

Some reasons why we think this work has high expected value

Our sense from talking to people doing longtermist work we think is promising has been that, for many, particular pieces of writing or videos were central to their turn towards their current paths.

This seems broadly in line with the results of the survey we conducted mentioned above. The bodies of written work of Nick Bostrom, Eliezer Yudkowsky, and Peter Singer were in the top 10 sources of impact on longtermist career trajectories (of e.g. organizations, people, and bodies of work) across several different measures. On one measure, Nick Bostrom’s work by itself had 68% of the impact of the most impactful organization and 75% of the impact of the second most impactful organization. When asked what outreach would attract similar people to longtermist work, 8% of respondents in the survey gave free-form responses implying that they think simply exposing similar people to EA/EA-adjacent ideas would be sufficient.

These data points suggest to us that even absent additional outreach programs, sharing these ideas more broadly could ultimately result in people turning towards career activities that are high-value from a longtermist perspective. For many who could work on idea dissemination, we think increasing the reach of existing works with a strong track record, like those given above, may be more impactful per unit of effort than creating new content.

Made-up examples of projects we think could be impactful

As above, these examples are intended to be illustrative of the kinds of programs we’d be interested in funding. This is not intended to be a comprehensive list, nor a list of the programs we think would be most impactful. We think these programs are unlikely to work fully as written and would like these projects to serve as inspiration, not as instructions.

Example 1: Collaborations with high-profile YouTube creators to create videos covering longtermist topics.

  • We think YouTube is an attractive promotional platform because different creators come with different audiences, making it easier to reach the kinds of people who most often become interested in longtermist ideas.

Example 2: Targeted social media advertising of episodes of the 80,000 Hours Podcast. The project would aim to maximize downloads of 80,000 Hours Podcast episodes coming via social media referrals.

  • The 80,000 Hours podcast seems promising to promote because we think it’s high-quality, quick to consume, and varied enough in content to appeal to a fairly wide audience.
  • The project could experiment with indiscriminately advertising podcast episodes to promising early-career individuals, e.g. STEM, economics, or philosophy students, or with advertising select podcast episodes on particular topics to audiences that they may appeal to.
  • Any project of this form should be done in collaboration with 80,000 Hours.

Example 3: A website that delivers, to people with a .edu email address who request them, free copies of physical books, e-books, or audiobooks that seem helpful for understanding how to do an outsized amount of good.

  • The bulk of the project work could be focused on website design and advertising, while book distribution could be handled through EA Books Direct, or done as part of this project.

Example 4: A MOOC covering existing AI safety work.

Example 5: A new magazine that covers potentially transformative technologies and ways in which they could radically transform civilization in positive or negative ways.

Application process

Primary application

If you think you might want to implement either of the kinds of outreach projects listed above, please submit a brief pre-proposal here. If we are interested in supporting your project, we will reach out to you and invite you to submit more information. We encourage submissions from people who are uncertain if they want to found a new project and just want funding to seriously explore an idea. If it would be useful for applicants developing their proposals, we are open to funding them to do full-time project development work for 3 months. We are happy to look at multiple pre-proposals from applicants who have several different project ideas.

We may also be able to help some applicants (e.g. by introducing them to potential collaborators, giving them feedback about plans and strategy, providing legal assistance, etc.) or be able to help find others who can. We are open to and encourage highly ambitious proposals for projects that would require annual budgets of millions of dollars, including proposals to scale existing projects that are still relatively small.

We intend to reply to all applications within two months. We have also been in touch with the Effective Altruism Infrastructure Fund and the Long-Term Future Fund, and they have expressed interest in funding proposals in the areas we describe above. If you want, you can choose to have them also receive your application via the same form we are using.

There is no deadline to apply; rather, we will leave this form open indefinitely until we decide that this program isn’t worth running, or that we’ve funded enough work in this space. If that happens, we will update this post noting that we plan to close the form at least a month ahead of time.

Collaborator application

If you aren’t interested in starting something yourself, but you would be interested in collaborating on or helping with the kinds of outreach projects listed above (either full or part-time), let us know here. We will connect you to project leads if we feel like there is a good fit for your skills and interests.

If you have any questions, please contact longtermfuture-outreach-rfp@openphilanthropy.org.


  1. Our work in this space is motivated by a desire to increase the pool of talent available for longtermist work. We think projects like the ones we describe may also be useful for effective altruism outreach aimed at other cause areas, but we (the team running this particular program, not Open Philanthropy as a whole) haven’t thought through how valuable this work looks from non-longtermist perspectives and don’t intend to make that a focus. ↩︎


Comments

Regarding your podcast example, I have some thoughts:

Psychometrics is both correct and incredibly unpopular: this means there is possibly an arbitrage here for anyone willing to believe in it.

Very high IQ people are rare and often have hobbies that are considered low-status in the general population. Searching for low-status signals that are predictive of cognitive ability looks to be an efficient means of message targeting. 

It is interesting to note that Demis Hassabis’s prodigious ability was obvious to anyone paying attention to board game competitions in the late 90s. It may have been high-ROI to sponsor the Mind Sports Olympiad at that time just for a small shot at influencing someone like Demis. There are likely other low-status signals of cognitive ability that will allow us to find diamonds in the rough.

Those who do well in strategic video games, board games, and challenging musical endeavors may be worth targeting. (Heavy metal, for example, being very low-status and extremely technical musically, is a good candidate for being underpriced.)

With this in mind, one obvious idea for messaging is to run ads. Unfortunately, high-impact people almost certainly have ad-blockers on their phones and computers. 

However, the podcast space offers a way around this. Most niche third-party apps allow podcasters to advertise their podcasts on the podcast search pages. On the iPhone, at least, these ads cannot be trivially blocked.

As the average IQ of a third-party podcast app user is likely slightly higher than that of first-party podcast app users, the audience is plausibly already slightly enriched for high-impact people. By focusing ads on podcast categories that are both cheap and good proxies for listeners’ IQs (especially of the low-status kind mentioned above), one may be able to do even better.

I have been doing this for the AXRP podcast on the Overcast podcast app, and it has worked out to about $5 per subscriber. I did this without asking the permission of the podcast's host.

Due to the recurring nature of podcasts and the parasocial relationship listeners develop with their hosts, it is my opinion that their usefulness as a propaganda and inculcation tool is currently underappreciated. It is very plausible to me that $5 per subscriber may indeed be very cheap for the right podcast.

Directly sponsoring niche podcasts with extremely high-IQ audiences may be even more promising. There are likely mathematics, music theory, games and puzzle podcasts that are small enough to have not attracted conventional advertisers but are enriched enough in intelligent listeners to be a gold mine from this perspective. 

I do not think I am a particularly good fit for this project. My only qualification is I am the only person I am aware of who is running such a project. Someone smarter with a better understanding of statistics would plausibly do far better. Perhaps if you have an application by a higher-quality person with a worse idea, you can give them my project. Then I can use my EA budget on something even crazier! 

I think some of these are really neat and interesting ideas. I will keep them in mind, but also encourage you to think about whether you might actually be the best fit for this project (as far as I know no one has done it so far but you, it's hard for people to absorb one another's models and enthusiasms, and I doubt we will get a ton of applicants).