At EA UC Berkeley, we’re launching an ongoing series of contests called the Artificial Intelligence Misalignment Solutions (AIMS) series. This second contest, the Distillation Contest, is now open to any student enrolled in a university/college: here are our interest and submission forms! The contest has prizes as large as $2,500 and closes on May 20th. If you’re interested, please fill out the interest form so that I can send you updates on resources and changes in the contest! This blog post restates the information that is on our website, with a bit more explanation of the contest's purpose. 

(This is cross-posted on the EA Forum and LessWrong.)


AIMS Series

I think that it is currently difficult for university students to find tangible ways to engage with AI Safety. Generally, by creating a series of AI Safety contests, I hope to:

  • Help build social capital for students who are interested in Alignment and potentially good at it.
  • Create ways for people to test their fit for Alignment work.
  • Create a “brand” around these contests over time so that CS students recognize the name and winners recommend the contests to their friends. Hopefully, this name recognition would also make it easier to form partnerships with CS organizations.

For this specific contest, I’m inspired by the arguments that the field of AI Alignment needs more distillers to improve communication within the field, as well as to make their research accessible to a wider audience. The Distillation Contest aims to produce value by:

  • Recruiting CS students who have never heard of EA or Alignment before (I will be doing this outreach at UC Berkeley through advertising, but other organizers are welcome to advertise to their own groups for recruitment).
  • Increasing the engagement of students who are already interested in Alignment.
  • Potentially producing useful distillations of Alignment research and increasing accessibility to said research.

Contest description

The Distillation Contest asks that participants: 

  1) Pick an article/post/research paper on AI Alignment/Safety (ideally from our list below) that would benefit from being more clearly explained.
  2) Indicate which ideas or sections of their chosen research should be distilled. Applicants can distill a whole post/article, a specific part of one, or multiple posts/articles.
  3) Create a distillation: a clearer explanation of the research, along with a new example or new application of it.
  4) Optionally: if the research you’re distilling is trying to solve a problem, you can attempt to create an additional solution to that problem and include it in your submission.

What makes a good distillation?

A good distillation explains the most confusing part of another piece of writing; its value lies in creating new ways to understand difficult concepts or dense technical prose. It should also help readers see how the distilled ideas relate to other Alignment research. Because of this, creating a good distillation will likely require participants to read related research beyond their chosen post in order to make sure they fully understand the ideas it presents.

As an example of a great distillation, Holden Karnofsky, after creating the Most Important Century Series, created a roadmap to make the series more digestible and navigable. Additionally, Scott Alexander has distilled multiple complex dialogues (and even a meme) in order to make them more accessible. 

Posts/articles that we encourage applicants to distill are collected in the list below. Applicants may propose posts/articles outside of this list, but the judges may decide that those articles are not convoluted enough to need distillation, so we recommend choosing from the list.


Prizes

$2,500 - One prize available for the 1st place submission.

$1,250 - One prize available for 2nd place submission.

$500 - Up to 5 prizes available.

$250 - Up to 10 prizes available.

All prize winners’ names will be posted on the EA Berkeley website, and selected distillations may optionally be posted there as well.


Timeline

April 13th - Interest and submission forms open

Filling out the interest form is not required to enter the contest, but doing so adds you to an email list with information about opportunities to further engage with others about the contest.

May 20th - Submission deadline (11:59 PM)

Late May/Early June - Winners announced, prizes allocated, names added to the website.


Judging

Distillations will be scored on the following factors:

  • Depth of understanding
  • Clarity of presentation
  • Rigor of work
  • Concision/Length (longer papers will need to present more information than shorter papers)
  • Originality of insight
  • Accessibility

Preference may be given to distillations that:

  • Synthesize multiple sources
  • Work well as an accessible introduction to a topic

Final Notes

There are a few other purposes to this contest that I did not list above but may write about in a future post on the EA Forum! There are also likely some great articles worth distilling beyond the current list of recommendations (which were chosen by Akash Wasil). If you have top recommendations for articles you'd like to see distilled, let me know; I may add them to the existing list so that applicants have a higher chance of distilling them.

Finally, since the contest is open to all students, please feel free to share our contest information with university students you know! Here is a link to our current advertising material for other organizers to distribute if they'd like. If you’d like to advertise, please be sure to advertise the interest form, since that's how I’ll be sending out reminders about the contest deadlines and gathering emails to form a database of students interested in AI Safety.


Great initiative! I'm surprised that PDF is the only accepted submission format, given that folks in ML research distillation are pushing for interactive explanations and there are already several great examples. Any reason for that?

Oh, that’s a great point! I’m glad to open submissions to other formats! PDF was mainly to make the distillations easy to read, since I presumed they would be written. Do you have any recommended formats I should open it to, or do you suggest that lifting limitations on format altogether is the best move?

LessWrong/Alignment Forum posts!! This seems like a natural place to share Alignment distillations, and it'll be much more convenient for people to read the summaries directly in posts rather than having to click through to a PDF (clicking through links is a trivial inconvenience that really does result in far fewer views). LessWrong posts also have the advantage of a comment section right below.

One of my plans is to make LessWrong much better at supporting the creation and display of distillation content, and having good distillation posts would help toward that.

I suppose it could be either a PDF or a web page? A web page is definitely something anyone can open and read :) For people considering an interactive web page, you could suggest that they use the Distill infrastructure and look at existing examples. One advantage of doing so is that they could then submit their paper to the Distill journal and get more exposure.

I've adjusted the submission form so that it's open to attachments and links now! Thank you to everyone who added thoughts on this! Soon I'll add some notes to my website to encourage people to make posts and apply to Distill's journal :) 


I have a few questions:

Do we need to study at a US university in order to participate? I’m in Europe.

Who should be the target audience for the posts? CS students? The average LW reader? People somewhat interested in AI alignment? The average Joe? How much do we need to dumb it down?

Can we publish the posts before the contest ends?

Will you necessarily post the winners' names? Can we go by a pseudonym instead?

How close to the source material should we stay? I might write a post about what value learning is, why it seems like the most promising approach and why it might be solvable, which would involve explaining a few of John Wentworth’s posts. But I don't think my reasoning is exactly the same as his.

Also, is there any post that whoever is reading this comment tried and failed to understand? Or better yet, tried hard to understand but found completely impenetrable? If so, what part did you find confusing? If I choose to participate and try to explain that post, would you volunteer to read a draft to check that I’m explaining it clearly?

You can participate from any country! The submission should just be in English (but, if you'd like to additionally submit in another language, that could be cool to increase accessibility).

The target audience would ideally be a few levels lower than the audience of the original piece. If it was originally written for alignment researchers, then making the paper accessible to a CS student is good, but making it accessible to people who have never heard of alignment could be even better (although certainly harder). This is what the "accessibility" scoring point would be focusing on, but it's far from the only factor. 

You can publish the post before the contest ends, but please indicate that the post is intended to be part of the contest! We'd ideally like to be encouraging people to create content they wouldn't have without the contest. 

A pseudonym works! Publishing names would mainly be to help students build social capital, so it's an optional part of winning. 

I think that as long as you faithfully explain the original posts, you would be very welcome to add your own reasoning to the discussion.

(Would also love to hear answers to the last question!)

Is it only open to students? I am finished with my degree and got a job, but would like to participate, is this possible?

Unfortunately, this contest is only open to students. I may host a larger distillation contest in the future, but this one was designed to increase outreach to university students since contests including professionals may be too intimidating. Thank you for your interest, though!

This looks awesome! Is the contest open to master's and PhD students or just undergraduates? Apologies if this is spelled out somewhere and I just missed it.

Thank you! This contest is open to Master's and PhD students as well! I tried not to say it directly in the advertising material so that undergrads might be less intimidated, but thank you for asking!

Do you have a poster that can be put up on campuses to spread the information?


Yes! Apologies, none of my links carried over to this post, so I'll edit them in now. The poster will be linked in the Final Notes section.

Would one be allowed to make multiple submissions distilling different posts? I don't know if I would necessarily want to do that, but I'm at least curious about the ruling.