I’m Michael Aird, a Researcher at Rethink Priorities, a Research Scholar at the Future of Humanity Institute, and a guest manager at the Effective Altruism Infrastructure Fund. Opinions expressed are my own. You can give me anonymous feedback at this link.
At Rethink Priorities, I'm currently working mostly on nuclear risk research and AI governance & strategy research.
I mostly post to the EA Forum.
If you think you or I could benefit from talking, feel free to message me or schedule a call. For people interested in doing effective-altruism-related research/writing, testing their fit for that, "getting up to speed" on EA/longtermist topics, or writing for the EA Forum/LessWrong, I also recommend this post.