Comments

I am donating half my net worth over the next few years, alongside most of my productive energies, towards efforts to make it more likely that we end up in one of the better futures. I'm focusing on highly frugal people, as they are more neglected by major grantmakers, so my funds will go further.

If anyone wants to commit to joining the effort, please reach out. There is plenty of low-hanging fruit in the alignment ecosystem space.

  • There's also the new Alignment Ecosystem Slack, but that's invite only currently. From the tag: "If you'd like to join message plex with an overview of your involvement."
  • I found a great designer/programmer for one of my alignment projects on the EA Creatives and Communicators Slack.
  • Impact Markets is somewhat relevant.

I am getting ready to help launch two more in the next couple of weeks: one for alignment grantmakers (gated to people who can verify they've directed $10k+ towards alignment), and one for software engineers who want to help with alignment. They're not polished yet, so they're not ready for a big announcement, but feel free to turn up early if you're sold on the idea already.

(also, neat, I got cited!)

I'm hopeful that most people would see the difference between "rich people trying to save their own skin" and "allowing researchers who are trying to make sure humanity has a long-term future at all to continue their work", but I would be very happy to have leads on who to talk to about presenting this well.

Rob's is invite-only right now, but I have an invite and sent it to you. I'm adding links to the two public ones to the main post. The investing one is private and I don't have any invites, but they're considering switching to an open invite next week, so I'll get back to you.

For how to get on the radar of grantmakers, I think applying to microgrant programs like ACX+ or LTFF is a reasonable start (one of my grantees came from ACX+). Another option, which should open up soon with impact certificates, is just going ahead and doing something of value and then selling your impact. Talking to AI Safety Support is also a near-universally good thing to do for people who want to help. I'd also be open to hearing more about your situation in DM.

A lot of that makes sense, other than that I don't understand the angle-grinding part?