Perhaps other employers should also employ everyone half-time so that they get more information about their employees' market value?

If SIAI were paying Eliezer to be a "generic" programmer, then I suppose they could get a reasonable idea of whether he's a good one in the way you describe. Or they could just fire him and hire some other guy for the same salary: that's not a bad way of getting (where SIAI is) a middling-competent programmer for hire.

But it doesn't seem to me even slightly credible that that's what they're paying Eliezer for. They might want him writing AI software -- or not, since he's well known to think that writing an AI system is immensely dangerous -- in which case sending him out to work half-time for some random software company isn't going to give much idea of how good he is at that. Or they might want him Thinking Deep Thoughts about rationality and friendly AI and machine ethics and so forth, in which case (1) his "market value" would need to be assessed by comparing with professional philosophers and (2) presumably SIAI sees the value of his work in terms of things like reducing existential risk, which the philosophy-professor market is likely to be ... not very responsive to.

What sending Eliezer out to work half-time commercially demonstrably won't do is to measure his "effectiveness" at anything that seems at all likely to be what SIAI thinks it's worth paying him $100k/year for.

The most likely effects seem to me some combination of: (1) Eliezer spends less time on SIAI stuff and is less useful to SIAI. (2) Eliezer spends all his time on SIAI stuff and gets fired from his other job. (3) Eliezer finds that he can make a lot more money outside SIAI and jumps ship or demands a big pay rise from SIAI. (4) Eliezer decides that an organization that would do something so obviously silly is not fit to (as he sees it) try to decide the fate of the universe, quits SIAI, and goes to do his AI-related work elsewhere.

No combination of these seems like a very good outcome. What's the possible benefit for SIAI here? That with some (not very large) probability Eliezer turns out not to be a very good programmer, doesn't get paid very well by the commercial half-time gig, accepts a lower salary from SIAI on the grounds that he obviously isn't so good at what he does after all, but doesn't simultaneously get so demoralized as to reduce his effectiveness at what he does for SIAI? Well, I suppose it's barely possible, but it doesn't seem like something worth aiming for.

What am I missing here? What halfway plausible way is there for this to work out well?

I think it's entirely possible for people within corporations to build cozy empires and argue that they should be paid well, and for those same people to in fact be incompetent at value creation - that is, they could be zero-sum internal-politics specialists. The corporation would benefit from enforcing a policy against this sort of "employee lock-in", just like corporations now have policies against "supplier lock-in".

This would entail, among other things, everyone within the corporation having a job description that is sufficiently g...

SIAI Fundraising

by [anonymous] · 6 min read · 26th Apr 2011 · 120 comments



Please refer to the updated document here: http://lesswrong.com/lw/5il/siai_an_examination/

This version is an old draft.

 

NOTE: The analysis here will be updated as people point out errors! I've tried to be accurate, but this is my first time looking at these (somewhat hairy) non-profit tax documents. Errors will be corrected as soon as I know of them! Please double-check and criticize this work so that it can improve.

Document History:

  • 4/25/2011 - Initial post.
  • 4/25/2011 - Corrected Yudkowsky compensation data.
  • 4/26/2011 - Added expanded data from 2002 - 2009 in Overview, Revenue, and Expenses.
  • 4/27/2011 - Added expanded data to Officer Compensation & Big Donors.

Todo:

  • Create a detailed program services analysis that examines the SIAI's allocation of funds to the Summit, etc.
  • Create an index of organizational milestones.

Disclaimer:

  • I am not affiliated with the SIAI.
  • I have not donated to the SIAI prior to writing this.

Acting on gwern's suggestion in his Girl Scout Cookie analysis, here is a first pass at looking at SIAI funding, along with suggestions for a funding task force, etc.

The SIAI's Form 990s are available at GuideStar and Foundation Center. You must register in order to access the files at GuideStar.

Work is being done in this Google Spreadsheet.

Overview

Notes:
Sometimes the listed end-of-year balances didn't match what the spreadsheet calculated (a reconciliation sketch follows this list):
  • Filing Error 1? - There appears to be a minor typo, to the effect of $4.86, in the end-of-year balance for the 2004 document. The money is accounted for; the results just aren't entered correctly. (Someone else please verify.)
  • Filing Error 2? - The 2005 document appears to have accounted for expenses incorrectly, resulting in an excess $70,179.00 reported in the end-of-year asset balance. This money is accounted for under 2005 Part III; it is merely not correctly deducted from the year-end asset balance. (Someone else please verify.)
  • Theft? - The organization reported $118,803.00 in theft in 2009, resulting in a year-end asset balance lower than expected. The SIAI is currently pursuing legal restitution.
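Below is a minimal sketch, in Python, of the reconciliation check behind these notes. The per-year beginning balances, revenue, and expenses are hypothetical placeholders (only the resulting discrepancies mirror the ones described above); the real values would come from the Form 990s or the linked spreadsheet.

```python
# Sketch: reconcile reported end-of-year balances against
# beginning balance + revenue - expenses.
# All input figures below are hypothetical placeholders, not actual filing values.

filings = {
    # year: (beginning_assets, revenue, expenses, reported_end_assets)
    2004: (100_000.00, 200_000.00, 150_000.00, 150_004.86),   # hypothetical
    2005: (150_000.00, 300_000.00, 250_000.00, 270_179.00),   # hypothetical
}

for year, (begin, revenue, expenses, reported_end) in filings.items():
    expected_end = begin + revenue - expenses
    diff = reported_end - expected_end
    if abs(diff) > 0.01:
        print(f"{year}: reported end balance off by ${diff:,.2f}")
    else:
        print(f"{year}: balances reconcile")
```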

Analysis:

  • The SIAI's asset balance grew until 2008, when expenditures outpaced revenue.
  • Assets would have resumed growth in 2009 were it not for theft (see above).
  • The current asset balance is insufficient to sustain a year of operation at the existing rate of expenditure, so a significant loss of revenue would force a reduction in services. Such a loss may be unlikely, but a reasonable goal would be to build up a year's reserves (a runway sketch follows this list).
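A rough sketch of the reserves point, expressing the year-end asset balance as months of operating runway. The dollar figures are hypothetical placeholders, not numbers from the filings.

```python
# Sketch: months of operating runway = year-end assets / average monthly expenses.
# Placeholder figures; substitute the actual year-end assets and annual expenses
# from the spreadsheet.

year_end_assets = 250_000.00     # hypothetical
annual_expenses = 500_000.00     # hypothetical

monthly_burn = annual_expenses / 12
runway_months = year_end_assets / monthly_burn
print(f"Runway: {runway_months:.1f} months")   # 6.0 months with these placeholders

# The reserve goal suggested above corresponds to runway_months >= 12.
```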

Revenue

Revenue is composed of public support, program services (events/conferences held, etc.), and investment interest. The "Other" category tends to include Amazon.com affiliate income and the like.

Analysis:

  • Income from public support (donations) has grown steadily with a significant regular increase starting in 2006.
  • This regular increase is a result of significant new contributions from big donors.
    • As an example, public support in 2007 is largely composed of significant contributions from Peter Thiel ($125k), Brian Cartmell ($75k), and Robert F. Zahra Jr ($123k), for $323k total in large-scale individual contributions (breakdown below; an arithmetic check of these figures follows this list).
  • In 2007 the SIAI started receiving income from program services. Currently all "Program Service" revenue is from operation of the Singularity Summit.
  • The Singularity Summit revenue continues to grow. The Summit is roughly breaking even. If this continues, the Summit will be able to compensate speakers better, improve the quality of proceedings, or net some of the revenue for other goals.
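A quick arithmetic check of the 2007 large-contribution figures cited above; the individual amounts are the ones quoted in this post, and the total is simply their sum.

```python
# Check that the listed 2007 large contributions sum to the $323k total cited above.
large_2007 = {
    "Peter Thiel": 125_000,
    "Brian Cartmell": 75_000,
    "Robert F. Zahra Jr": 123_000,
}

total = sum(large_2007.values())
print(f"Total large-scale contributions, 2007: ${total:,}")   # $323,000
assert total == 323_000
```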

Expenses

Expenses are composed of grants, benefits, salaries & compensation, contracts, travel, program services, and an "Other" category (mostly administrative costs, usually itemized; check the source data).
The Contracts column in the chart below includes legal and accounting fees; again, check the source data.

Analysis:

  • This chart could use improvement. It's categorized rather clinically; it would be more useful to break down the Contracts and Program categories (though this may not be possible from the Form 990s).
  • The grants in 2002, 2003, and 2004 were paid to Eliezer Yudkowsky for work "of unique relevance and value to the Singularity, to Artificial Intelligence, or to Friendly Artificial Intelligence."
  • Program expenses include operating the Singularity Summit, Visiting Fellows Program, etc.
  • The Other category includes lots of administrative costs that are somewhat itemized.
  • Overall, expenses have grown in step with revenue.
    • Salaries have steadily declined. (More detail below.)
    • Program service expenses have increased, but this is expected as the Singularity Summit has grown and new services like the Visiting Fellows Program have been introduced.

Big Donors

Analysis:

  • Contributions in the 2010 column are derived from http://singinst.org/donors. Contributions of less than $5,000 are excluded for the sake of brevity.
  • Contributions in 2003 - 2009 are from official filings. The 2009 Form 990 discloses excess donations for 2006 - 2009. This is not an exhaustive list of contributions, just what could be found in the Form 990s available online.
  • The 2006 donation from Peter Thiel is sourced from a discussion with the SIAI.
  • Peter Thiel and a few other big donors compose the bulk of the organization's revenue.
    • Should any major donor be lost, the SIAI would have to reduce services. It would be good to see a broader base of donations moving forward (a concentration sketch follows this list).
    • Note, however, that over the past five years the base of donations HAS been improving. We don't have the 2010 Form 990 yet, but just based on data from MA and SingInst.com, things are looking a lot better.
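A minimal sketch of the donor-concentration concern: what share of public support comes from the single largest donor? The donor labels and contribution figures below are hypothetical placeholders, not values from the filings.

```python
# Sketch: fraction of public support coming from the single largest donor.
# Hypothetical figures for one year; substitute the actual spreadsheet values.

contributions = {
    "Donor A": 150_000,                   # hypothetical
    "Donor B": 75_000,                    # hypothetical
    "Donor C": 20_000,                    # hypothetical
    "Small donors (aggregate)": 55_000,   # hypothetical
}

total = sum(contributions.values())
top_donor, top_amount = max(contributions.items(), key=lambda kv: kv[1])
print(f"Top donor share: {top_amount / total:.0%} ({top_donor})")
# With these placeholders: 50% -- losing that donor would force a cut in services.
```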

Officer Compensation

Analysis:
  • This graph needs further work to reflect the duration of officers' service.
  • From 2002 to 2005, Eliezer Yudkowsky received compensation in the form of grants from the SIAI for AI research.
  • Starting in 2006 all compensation for key officers is reported as salaried instead of in the form of grants.
  • SIAI officer compensation has decreased in recent years.
  • Eliezer's base compensation (as salary) increased 20% in 2008 and then 7.8% in 2009 (a compounding check follows this list).
    • It seems reasonable to compare Eliezer's salary with that of professional software developers. Eliezer would be able to make a fair amount more working in private industry as a software developer.
  • Both Yudkowsky and Vassar report working a 60-hour work week.
  • It isn't indicated how the SIAI conducts performance reviews and salary adjustment evaluations.
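A small worked check of the growth figures above: a 20% increase followed by a 7.8% increase compounds to roughly a 29% increase over the two years. Only the percentages come from the filings; no dollar amounts are assumed.

```python
# Compound the stated year-over-year increases in base salary.
increase_2008 = 0.20    # +20% in 2008
increase_2009 = 0.078   # +7.8% in 2009

compound_growth = (1 + increase_2008) * (1 + increase_2009) - 1
print(f"Compound increase, 2007 -> 2009: {compound_growth:.1%}")   # ~29.4%
```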
Further Editorial Thoughts...

Prior to doing this investigation, I had some expectation that the Singularity Summit was a money-losing operation. I had an expectation that Eliezer probably made around $70k (programmer money discounted for being paid by a non-profit). I figured the SIAI had a broader donor base. I was off base on all counts. I am not currently an SIAI supporter. My findings have greatly increased the probability that I will donate in the future.

Overall, the allocation of funds strikes me as highly efficient. I don't know exactly how much the SIAI is spending on food and fancy tablecloths at the Singularity Summit, but I don't think I care: it's growing and it's nearly breaking even. An attendee can have a very confident expectation that their fee covers their cost to the organization. If you go and contribute you add pure value by your attendance.

At the same time, the organization has been able to expand services without draining the coffers. A donor can hold a strong expectation that the bulk of their donation will go toward actual work in the form of salaries for working personnel or events like the Visiting Fellows Program.

Eliezer's compensation is slightly more than I thought. I'm not sure what upper bound I would have balked at or would balk at. I do have some concern about the cost of recruiting additional Research Fellows. The cost of additional RFs has to be weighed against new programs like Visiting Fellows.

The organization appears to be managing its cash reserves well. It would be good to see the SIAI build up some asset reserves so that it could operate comfortably in years when public support dips, or so that it could take advantage of unexpected opportunities.

The organization has a heavy reliance on major donor support. I would expect the 2010 filing to reveal a broadening of revenue and continued expansion of services, but I do not expect the organization to have become independent of big-donor support. Things are much improved from 2006, and without the initial support from Peter Thiel the SIAI would not be able to provide the services it has; still, it would be good to see the SIAI's operating capacity exceed any one donor's annual contribution. It is important for Less Wrong to begin a discussion of broadening SIAI revenue sources.

Where to Start?

There is low-hanging fruit to be found. The SIAI's annual revenue is well within the range of our ability to effect significant impact. These suggestions aren't all equal in their promise; they are just things that come to mind.

  • Grant Writing. I don't know a lot about it. Presumably a Less Wrong task force could investigate likely candidate grants, research proper grant writing methodology, and then apply for the grants. Academic members of Less Wrong who have applied for research grants would already have expertise in this area.
  • Software. There are a lot of programmers on Less Wrong. A task force could develop an application and donate the revenue to the SIAI.
  • Encouraging Donations. Expanding the base of donations is valuable. The SIAI is heavily dependent on donations from Peter Thiel. A task force could focus on methods of encouraging donations from new supporters big and small.
  • Prize Winning. There are prizes out there to be won. A Less Wrong task force could identify a prize and then coordinate a group to work towards winning it.
  • Crowdfunding. There are sites devoted to crowdsourced funding for projects. A task force could conceive of a project with the potential to generate more revenue than it costs to build, with risk reduced through the use of crowdsourcing. Excess revenue would be donated to the SIAI. (Projects don't have to be software; they could be an interesting device, a piece of art, or music.)
  • General Fundraising Research. There are a lot of charities in the world. Presumably there are documented methods for growing them. A task force could attack this material and identify low-hanging fruit or synthesize new techniques.
I have more specific thoughts, but I want to chew on them a bit.
