Looking for AI Safety Experts to Provide High Level Guidance for RAISE

by ofer · 1 min read · 6th May 2018 · 5 comments


The Road to AI Safety Excellence (RAISE) initiative aims to help aspiring AI safety researchers and interested students get familiar with the research landscape efficiently, thereby hopefully increasing the number of researchers who contribute to the field. To that end, we (the RAISE team) are building a high-quality online course. You can see our pilot lesson here (under “Corrigibility 1”).

Most of the course segments will be based on distilled summaries of one or more papers. We have already distilled ~9 papers on corrigibility for the first course segments and used those summaries to write video script drafts.

Our long-term goal is to cover as much of the AI safety research landscape as possible, in the most useful way possible. To do that, we need guidance from experts with extensive familiarity with the literature in one of the broad subfields of AI safety (e.g. the machine learning perspective or the Agent Foundations research agenda, or broad parts thereof). We realize that such experts' time is a critically scarce resource, so we will ask them only for high-level guidance, including:

1) Their idea of a good structure for a part of the course: a list of sections, and the subsections that might constitute each one.

2) Pointers to papers to base each subsection on.

If an expert considers further contributions to RAISE an effective use of their time, they could also choose to review our lesson scripts and provide feedback before the videos are recorded.

Should this role be an effective use of your time, please contact us at raise@aisafety.camp.

Comments

I think that we should schedule a video chat. I might have a lot of content for you. Email me?

I've sent you an email, thanks!

Within RAISE there's a team that's working on constructing a prerequisites track for AI safety - something that people who lack some of the necessary undergraduate-level background could use to (1) identify what material they're missing; and (2) learn that material effectively.

The post you linked to is part of that project.

(note: I might be slightly misrepresenting that project - I'm not on the team that works on it)

You represented it well. We're currently doing 2 things at once. The prerequisites track was too good to pass up.