What is fiscal sponsorship?

It’s fairly common for EA orgs to provide fiscal sponsorship to other EA orgs.  Wait, no, that sentence is not quite right. The more accurate sentence is that there are very few EA organizations, in the legal sense; most of what you think of as orgs are projects that are legally hosted by a single org, and which governments therefore consider to be one legal entity. 

The king of umbrellas is Effective Ventures Foundation, which hosts CEA, 80k, Longview, EA Funds, Giving What We Can, Asterisk magazine, Centre for Governance of AI, Forethought Foundation, Non-Trivial, and BlueDot Impact. Posts about the castle also describe it as an EVF project, although it’s not listed on the website. Rethink Priorities has a program specifically to provide sponsorship to groups that need it. LessWrong/Lightcone is hosted by CFAR, and has sponsored at least one project itself (source: me. It was my project).

Fiscal sponsorship has a number of advantages. It gets you the privileges of being a registered non-profit (501(c)(3) in the US) without the time-consuming and expensive paperwork. That’s a big deal if the project is small, time-limited (like mine was), or an experiment you might abandon if you don’t see results in four months. Even for large projects/~orgs, sharing a formal legal structure makes it easier to share resources like HR departments and accountants. In the short term, forming a legally independent organization seems like a lot of money and effort for the privilege of doing more paperwork.


 

The downsides of fiscal sponsorship

…are numerous, and grow as the projects involved do.

The public is rightly suspicious of projects that share a legal entity while claiming to be independent, so bad PR for one risks splash damage to all. The government is very confident in its belief that you are the same legal entity, so legal risks are shared almost equally (I am not a lawyer). So sharing a legal structure automatically shares risk. That may be fixable, but the fix comes at its own cost.

The easiest thing to do is just take fewer risks. Don’t buy retreat centers that could be described as lavish. And absolutely, 100%, don’t voluntarily share any information about your interactions with FTX, especially if the benefits of doing so are intangible. So some amount of value is lost because a risk was worth it to an individual or small org, but not to the collective.

[it is killing me that I couldn’t follow the rule of three with that list, but it turns out there aren’t that many legible, publicly visible examples of decisions to not share information] 

And then there are the coordination costs. Even if everyone in the legal org is okay with a particular risk, you now have an obligation to check with them. The answer is often “it’s complicated”, which leads to negotiations eating a lot of attention over things no one cares that much about. Even if there is some action everyone is comfortable with, you may not find it because it’s too much work to negotiate between that many people (if you know anyone who lived in a group house during covid: remember how fun it was to negotiate safety rules between 6 people with different value functions and risk tolerances?). 

Chilling effects

A long, complicated (but nonetheless simplified) example

The original version of this story was one paragraph long. It went something like: A leader at an EVF-sponsored project wanted to share some thoughts on a controversial issue, informally but in public. The comments were not riskless, but this person would happily have taken the risk if it affected only themselves or their organization. Someone at EVF said no. Boo, grrr.

I sent that version to the source to check for accuracy. They gave me a new, more complicated story. Maybe it was good they never published those comments, because they were coming from an angry place. Maybe it was good they never published the initial version of those comments, but bad they didn’t publish a draft revised after a good night’s sleep. Maybe it’s not fair to blame EVF, since they (the commenter) gave up pretty quickly and maybe EVF would have said yes if they’d kept pushing. Maybe that’s all an excuse, and those comments were great and it was a tragedy they were lost…

It became clear there was no way to portray the story with the level of nuance the source wanted without giving enough details to totally blow their anonymity. I offered to let them write out the whole thing in their own words with their name on it. They considered it but didn’t feel able to do so without checking with their colleagues, which would have further delayed things and eaten up multiple people’s time. Especially because it probably wouldn’t have been a quick yes or no; it would have been more rounds of negotiation between people, all of whom were busy and didn’t hold this as a priority…

I told them not to bother, because it was unnecessary. The fact that it is so difficult to share enough information to even figure out whether the comments were net positive or negative, and how that would change if projects didn’t share fiscal sponsorship, is already on display. So instead I wrote up this story of trying to share the original example.


 

Other examples

As reported by Oliver Habryka: Will MacAskill has written up reflections on the SBF debacle, but EVF told him not to publish them.

Luke Freeman, Executive Director at Giving What We Can, said that the EVF board ordered a cessation of the GWWC pledge drive in the wake of the FTX explosion, and explicitly ascribed this to the EVF board making a conservative rule and not having time to review exceptions.

I object to this way less than to the censorship; not fundraising in the immediate wake of FTX seems like a pretty reasonable decision. But I expect the factors he brings up in defense of this decision (risk to sister projects and bandwidth limitations) to be systemic.

Confusion tolerance

There’s another issue with fiscal sponsorship. I think it’s minor compared to the chilling effect on risk-taking, but still worth mentioning. One side effect of sharing a legal structure is that people doing business with project P (e.g. 80k or Longview) will receive checks, invoices, or other paperwork that uses the name of the sponsoring organization O (e.g. EVF). This might look sketchy at first, but then someone explains fiscal sponsorship to you and you accept it.

Which is why it didn’t raise any alarm bells for me when my first check from “the FTX Future Fund (regrantor program)” came via CEA, and the second used the name North Dimensions.  I've gotten lots of checks that didn't match the organization's name, so I mumbled something about EA’s lack of professionalism and moved on with my day. What has come out since is that North Dimensions was a pre-existing company FTX bought in order to use its bank account, and that bank account was shared between FTX and Alameda in ways that have to have been inappropriate. 

[Note: I haven’t attempted to nail down the details of that bank account or my grants and may have gotten something wrong. I don’t think any individual error would contradict my claim that training people to accept misdirection creates cover for malfeasance. The fact that the situation breeds errors is the point.]


Conclusion

I think EA should grapple with the risk creation and risk aversion caused by fiscal sponsorship, especially umbrella orgs, and how those trade off against the benefits. This is hard because the benefits of sponsorship are legible and predictable, while the costs are nebulous and erratic. But that makes it all the more important to deliberately investigate them. My guess is that this will show that having multiple large orgs share a legal structure is not worth it, but that using sponsorship for short-term projects or as a launching pad will continue to make sense. Maybe I’m wrong, though; we can’t know until we check.


 

Comments

To add to your list at the start: AI Impacts is legally MIRI. And OpenPhil was legally GiveWell for 6 years.

Constellation used to legally be part of Redwood Research. (I believe this is no longer true or will soon no longer be true?)

AI Impacts is legally MIRI

As far as I know, AI Impacts does not seem to have had any non-trivial positive impact on the epistemics of the AI x-risk community, nor does it seem to have helped with any governance efforts.

Probably the best output I have encountered by people seemingly affiliated with AI Impacts is Zach's sequence on Slowing AI.

On the other hand, here are two examples (one, two) that immediately came to mind when I thought of AI Impacts. These two essays describe a view of reality that seems utterly pre-Sequences to me: the idea that there is something inherently unpredictable about reality, caused by chaos dynamics in complex systems, that limits the sorts of capabilities a superintelligence can have. Such an argument seems to imply that one should worry less about the possibility of superintelligent AGI systems ending humanity.

It seems like some of AI Impacts' research output goes against the very fundamental understanding that underpins why the creation of unaligned AGI is an extinction risk.

Is AI Impacts being funded by MIRI? I sure hope not.

Re: governance efforts, you might be forgetting the quite impactful 2016 and 2022 AI Impacts surveys: https://wiki.aiimpacts.org/doku.php?id=ai_timelines:predictions_of_human-level_ai_timelines:ai_timeline_surveys:2022_expert_survey_on_progress_in_ai

I think these both had a pretty large effect on a bunch of governance and coordination work.

Another post that had a pretty major impact was Katja's post "Let's Think about Slowing Down AI", which had a relatively large effect on me and many others.

I also think the discontinuity studies have been quite helpful, and I've frequently referenced them when talking about takeoff dynamics.

I don't believe AI Impacts is being funded by MIRI; we're talking about fiscal sponsorship here, which means that they're the same legal entity in the eyes of the government, and share accounting, but typically (and as it looks to me in this case) they do their fundraising entirely separately and do not work together.

On their output, I'll mention that I found Katja's post Discontinuous progress in history: an update quite interesting when thinking about how quickly things change historically, and I think Let’s think about slowing down AI helped get people thinking seriously about that topic. 

I think my main critique would be that their total output is fairly small (i.e. a fair bit less than I would have predicted 6-7 years ago).

I will keep harping on the point that more people should try starting (public benefit) corporations instead of nonprofits. At least give it five minutes' thought. Especially if handwaves impact markets something something. This should be in their Overton Window, but it might not be because they automatically assume "doing good => charity => nonprofit". Corporations are the standard procedure for how effective, helpful things get done in the world; they are RLHF'd by the need to acquire profit by providing real value to customers, reducing surface area for bullshitting. I am not an expert here by any means, but I'm noticing that I can go on Clerky or Stripe Atlas and spend a couple hours spinning up an organization, versus, well, I haven't actually gone through with trying to incorporate a nonprofit, but the process seems at least 10x more painful based on reading a book on it and on how many people seek fiscal sponsorship. I'm pretty surprised this schlep isn't talked about more. Having to rely on fiscal sponsorship seems pretty obviously terrible to me, and I hadn't even considered the information-distortive effects here. I would not be caught dead being financially enmeshed with the EVF umbrella of orgs after FTX. From my naive perspective, the castle could have easily been a separate business entity with EVF having at least majority control?

(I just realized I'm on LessWrong and not EA Forum, and could have leaned harder into capitalismpunk without losing as many social points.)

My chief guess for why this happens is that people don't realize it's an option or don't understand the distinction, and it isn't in their skillset or area of interest, so they don't dig deep enough to find out.

Actually, wow, that "people" sure sounds like I'm talking about someone else. Hi, I personally didn't have the phrase "public benefit corporation" cached in my head and I'm not actually sure what the distinction between that and a nonprofit is. That's not because it's totally irrelevant to my interests either. I've talked with two or three people over the last year specifically seeking advice on how to set up the legal structure for an organization that wasn't aiming to make a profit, and "public benefit corporation" isn't in my notes from any of those conversations. These weren't random people either! One was a director of a non-profit and the other was an (off the clock) lawyer!

And I think I'm unusually interested in organizational structure for someone in this space. There's a kind of corollary to Being The (Pareto) Best in the World, where you start to see that there are (for example) incredibly talented biologists all over, highly skilled statisticians available if you know where to look, but a comparatively far smaller number of expert biologist-statisticians. Adroitness with the intricacies of bureaucratic organizational structure is a third skill. Stripe Atlas has put some serious work into making the process of creating an organization easy and painless for people fluent in Internet because it's common for someone with the WebDesign-ViableBusiness skill overlap to not have put any points in Organizational Bureaucracy. If you want a biologist-statistician-bureaucrat, you are looking for what's actually a pretty narrow slice of the population! If a biologist-statistician (or any other fertile combination of non-bureaucrat-adjacent skills) wanted to do some biology-statistics and someone else offered to handle the organizational backend, I absolutely understand why they might jump at it!

(Over the last year, I've started to regard Being The (Pareto) Best in the World in much the same way that one regards a dread prophecy of doom, spoken from a rent in the air above some sulphurous chasm. The blindingly obvious gaps I see in the world are often at the intersections of three different skillsets.)

Lest I sound like I have zero suggestions: Do you recommend something for people to read if they want to do a quick bit of upskilling here?

they are RLHF'd by the need to acquire profit by providing real value to customers

This is exactly why I would disagree with this suggestion.

Do you know of any references people could use to investigate more?

Forming a nonprofit is not that difficult. It's like four extra hours of work to get 501(c)(3) status, plus a decent time delay of several months. Having someone else fill out the 990 for you is nice, though!

I cheerfully believe it's not actually that difficult if you know what you're doing. I think building a website isn't that difficult and doing a forward roll isn't that difficult and baking a loaf of bread isn't that difficult, and some people find those activities hard.

If you happen to know of a well written, straightforward guide you'd like to point me at, maybe one with some explainers about what various options mean and what the tradeoffs are compared to alternative structures, I and possibly other readers could probably benefit from a link!

I think another disadvantage that the post doesn't mention is that being an Organization confers a sense of legitimacy, in part because it acts as a costly signal that one can manage all the legal complexities, pay for paperwork, etc.; so being a sponsored project is a way to call yourself an Organization without paying the cost of the signal. In the limit of widespread knowledge about the legal subtleties of fiscal sponsorship, and due diligence on the part of the community to always keep the distinction in mind, this is not a problem. But people are lazy, and so this sort of confusion does, I think, benefit sponsored projects.

Another effect I'm very concerned about is the unseen effect on the funding landscape. For all that EVF organisations are said to be financially independent, none of them seem to have had any issue getting funding, primarily from Open Phil (generally offering market-rate or better salaries, and in some cases getting millions of dollars for marketing alone), while many other EA orgs - and, contra the OP, there are many more* outside the EVF/RP net than within - have struggled to get enough money to pay a couple of staff a living wage.

* That list excludes regional EA subgroups, of which there are dozens, and of which there would no doubt be more if a small amount of funding were available.

Huh, I had never heard of this umbrella Effective Ventures Foundation before, let alone about its ability to muzzle individual speech.

EVF used to be known as the Centre for Effective Altruism. They attempted to maintain a distinction between the umbrella group and CEA-as-defined-by-CEA's-website, but it was a disaster and they properly changed it.
