A basic issue with a lot of deliberate philanthropy is the tension between:
The knee-jerk solution I'd propose is "proof of novel work". If you want funding to do X, you should show that you've done something to address X that others haven't done. That could be a detailed, insightful write-up (which indicates serious thinking / fact-finding); that could be some work you did on the side, which isn't necessarily conceptually novel but is useful work on X that others were not doing; etc.
I assume this is an obvious / not-new idea, so I'm curious where it doesn't work. Also curious what else has been tried. (E.g., many organizations say "don't apply, we only give to {our friends, people we find through our own searches, people who are already getting funding, ...}".)
So let me jump in and say: I've been on Less Wrong since it started, and have engaged with topics like transhumanism, saving the world, and the nature of reality since before 2000; and to the best of my recollection, I have never received any serious EA, rationalist, or other funding, despite occasionally appealing for it. So for anyone worried about being corrupted by money: if I can avoid it so comprehensively, you can do it too! (The most important qualities required for this outcome may be a sense of urgency and a sense of what's important.)
Slightly more seriously, if there is anyone out there who cares about topics like fundamental ontology, superalignment, and theoretical or meta-theoretical progress in a context of short timelines, and who wishes to fund it, or who has ideas about how it might be funded, I'm all ears. By now I'm used to having zero support of that kind, and certainly I'm not alone out here, but I do suspect there are substantial lost opportunities involved in the way things have turned out.
This is crossposted from the EA Forum because I expect similar (but weaker) dynamics to affect the rationality community.
People working in the AI industry are making stupid amounts of money, and word on the street is that Anthropic will have some sort of liquidity event soon (possibly an IPO sometime next year). A lot of people working in AI are familiar with EA and intend to direct donations our way (if they haven't started already). People are starting to discuss what this might mean for their own personal donations and for the ecosystem, which is encouraging to see.
It also has me thinking about 2022. Immediately before the FTX collapse, we were just starting to reckon, as a community, with the pretty significant vibe shift in EA that came from having a lot more money to throw around.
CitizenTen, in "The Vultures Are Circling" (April 2022), puts it this way:
Other highly upvoted posts from that era:
I wish, for many reasons, that FTX hadn't committed fraud and collapsed, but one reason feels especially salient right now: we never finished processing how abundant funding impacts a high-trust altruistic community. The conversation had barely started.
I would say that I'm worried about these dynamics emerging again, but there's something a little more complicated here. Ozy actually calls out a similar strand of dysfunction in (parts of) EA in early 2024:
So these dynamics are not "emerging again". They haven't left. And I'm worried that they might get turbocharged when money comes knocking again.