Effective work on complex topics requires many people working together.
For AI alignment, LessWrong and its sister site, the Alignment Forum, are arguably the Schelling points. But there are other platforms that, for one reason or another - language, culture, size, personal preference - are better suited for individual contributors.
In "Being an individual alignment grantmaker", the following platforms were mentioned:
> Next, I increased my surface area with places which might have good giving opportunities by involving myself with many parts of the movement. This includes Rob Miles’s Discord, AI Safety Support’s Slack [named "AI alignment"], in-person communities, EleutherAI, and the LW/EA investing Discord, where there are high concentrations of relevant people, and exploring my non-LW social networks for promising people.
Other such hubs that I know of (all only with tangential AI safety focus):
- Astral Codex Ten: Scott Alexander discusses alignment sometimes, too; has a lively comment section on Substack; also an ACX Discord and subreddit
- Rationality Berlin: Slack of the Berlin LessWrong community; maybe 30 members; one AI alignment effort; meditation, dojos, a monthly meetup; organizes the yearly European LessWrong Community Weekend
- Google Groups:
  - bayarealesswrong (active)
  - less-wrong-parents (inactive)
  - lesswrong-hamburg: group of my meetup in Hamburg, very low activity
  - probably many other regional groups
- Bountied Rationality: Facebook group that serves as a marketplace for small tasks in the community
Many channels require an invite. If you post additional hubs, please mention whether an invite is needed and how one might get invited.
I will mention that there is a 'Control Problem' subreddit - not exactly high-level discussion, but it does cross-post a lot of good information from time to time: https://www.reddit.com/r/ControlProblem/
Gwern often posts to https://www.reddit.com/r/mlscaling/ as well.