Effective work on complex topics requires multiple people working together[citation needed].

For AI alignment, LessWrong and its sister site, the Alignment Forum, are arguably the Schelling points. But there are other platforms that, for one reason or another - language, culture, size, personal preference - are better suited for individual contributors.

In the post Being an individual alignment grantmaker, the following platforms were mentioned:

Next, I increased my surface area with places which might have good giving opportunities by involving myself with many parts of the movement. This includes Rob Miles’s Discord, AI Safety Support’s Slack [named "AI alignment"], in-person communities, EleutherAI, and the LW/EA investing Discord, where there are high concentrations of relevant people, and exploring my non-LW social networks for promising people.

Other such hubs that I know of (all with only a tangential AI safety focus):

Many channels require an invite. If you post additional hubs, please mention whether an invite is needed and how one might get invited.

3 Answers

  • There's also the new Alignment Ecosystem Slack, but that's currently invite-only. From the tag: "If you'd like to join message plex with an overview of your involvement."
  • I found a great designer/programmer for one of my alignment projects on the EA Creatives and Communicators Slack.
  • Impact Markets is somewhat relevant.

I am getting ready to help launch two more in the next couple of weeks: one for alignment grantmakers (gated to people who can verify they've directed $10k+ towards alignment), and one for software engineers who want to help with alignment. They're not polished yet, so they're not ready for a big announcement, but feel free to turn up early if you're sold on the idea already.

(also, neat, I got cited!)

Others that I find worth mentioning are channels with opportunities for getting started in AI safety. I know both AGI Safety Fundamentals and AI Safety Camp have Slack channels for participants. An invitation is needed, and you probably need to be a participant to get invited.

There is also an 80,000 Hours Google group for technical AI safety. An invitation is needed, and I can't find anywhere that they've broadcast how to get in, so I won't share it. But since they mention it on their website, I assume it is okay to include it here.

I've also heard of research groups in AI safety having their own Discord and Slack channels. In those cases, to get invited you should probably contact someone at the specific place and show that you are interested in their research. I'm keeping this vague because, again, I don't know how public their existence is.

2 comments

I will mention that there is a 'Control Problem' subreddit. It's not exactly high-level discussion, but it does cross-post a lot of good information from time to time: