If you are interested in supporting AI safety researchers and their research projects, the MATS team is currently hiring for Community Manager, Research Manager, and Operations Generalist roles and would love to hear from you. Please apply by Nov 3 to join our team for the Winter 2024-25 Program (Dec 9-Apr 11). Most open roles have the possibility of long-term extension, depending on performance and program needs.
Introduction
The ML Alignment & Theory Scholars (MATS) program aims to find and train talented individuals for what we see as the world’s most urgent and talent-constrained problem: reducing risks from unaligned artificial intelligence. We believe that ambitious young researchers from...