We're giving away 100 Aerolamp DevKits: lamps that kill germs with far-UVC. Are you sick of getting sick in your group house? Want to test out fancy new tech that may revolutionize air safety? Claim your Aerolamp. What is far-UVC? Far-UVC is a specific wavelength of ultraviolet light that...
TL;DR: Mox, a community space and incubator in San Francisco, is offering complimentary access twice a month to members of AI safety orgs and other causes we admire. Current partners of the Mox Guest Program include: * ARC * MIRI * Elicit * GovAI * FAR AI * Seldon Labs * Tarbell...
Last week, I joined “Revolutionist’s Night” — a cabal aiming to radically improve animal welfare, in the spirit of 1800s slavery abolitionism or the 1700s American Revolution. Our topic for the evening was inspired by Aaron Boddy’s post “The current market price for animal welfare is zero”: can we create...
Each year, Manifund partners with regrantors: experts in the field of AI safety, each given an independent budget of $100k+. Regrantors can initiate fast, small grants, seeding early-stage projects with $5k-$50k. For 2025, we’ve raised $2.25m so far, and are excited to announce our first 10 regrantors: * Neel Nanda...
Hey! Austin here. At Manifund, I’ve spent a lot of time thinking about how to help AI go well. One question that bothered me: so much of the important work on AI is done in SF, so why are all the AI safety hubs in Berkeley? (I’d often consider this...
AI for Epistemics is about leveraging AI for better truthseeking mechanisms — at the level of individual users, of society as a whole, or transparently within the AI systems themselves. Manifund & Elicit recently hosted a hackathon to explore new projects in the space, with about 40...
What do I mean by “homegrown”? These projects are: * Local: Creators have a good track record in the EA or AI Safety community * Modest: The amount requested is not large; $5k would be meaningful * Overlooked: Not already backed by large institutional funders like OpenPhil If you’re a...