This is my first LessWrong comment - any feedback appreciated.
My quick takes (with a similar conflict: I'm doing AIS field-building).
I am inclined to agree with ~everything in this post.
I think the status dynamics are hard to overstate.
I know quite a few very competent builders / 'doers' who have bounced off EA/AIS.
And part of this is about the elevated status given to researchers, especially in contrast with the way 'operations' people (a catch-all used to encompass a large fraction of everything else) are treated.
The response I often hear, explicitly or in the undertones, is: 'But if they were really committed, they'd just do the thing that needs to be done.' So they are expected to ignore the status gradients and e.g. build a nonprofit that is illegible to those outside EA/AIS or a for-profit with a harder path to profitability.
Meanwhile, these communities are usually excited about people doing relevant-seeming AIS research, even when they might be doing so because it's interesting or high-status, rather than because it's the thing to do that has highest impact or is most important to AIS.
I think (c) is usually good - pain is not the unit of effort and people are usually more productive when they enjoy and feel valued for doing something.
But (b) and (c) together mean we hold people to a much higher standard for building than for researching. We expect builders, unlike researchers, to be really committed and to fight against incentive gradients, rather than shifting the incentives for them. This is even though (low-confidence take) we might need the marginal builder more than the marginal 'equally-skilled' AIS researcher.
(Side note: In practice, I think the skills for these paths are fairly uncorrelated, such that this comparison is relevant for how we shape the field and prioritise people, but usually not whether a particular person does research or building.)
Two cruxes are as follows (hat tip Habryka - how do I tag him?):
How hard, and how necessary, is it to manage a much larger AIS community / recruitment pipeline?
I think it might be necessary to massively grow the AIS community, in which case we just have to pay the 'trust penalty' of having some schemers try to get resources and status, making it harder to assess people and make progress in general.
It's harder to screen for the qualities you want when many people are vying for jobs and might be Goodharting the criteria. But how much harder?
On the one hand, many companies in competitive industries face this problem and still seem to do fine (investment banks, top consulting firms, quant trading firms, etc.).
On the other hand, AIS often has bad feedback loops, so it's harder to tell whether someone is not really optimising for the important thing (or is even explicitly just focusing on looking good) - and success itself may be much harder.
I think I would change my view if I thought it was unnecessary and extremely hard to manage a much larger AIS community / recruitment pipeline.
I want to register a strong, although potentially unfair, gut reaction to Habryka's comment, which was something along the lines of: 'This feels like classic rationalist-y ex-post justification for prioritising vibes over winning. A smaller community with people who are more value-aligned can feel easier, but AIS probably just needs to grow a lot. Maybe you don't like having to figure out people's intentions and be in a world where people aren't so transparent. But maybe it's needed. This vaguely reminds me of EAs saying they need to hire other EAs when perhaps the real problem is that they're not good enough at management to manage non-EAs.'
How important is deep research understanding to building successful AIS orgs?
If you need strong research understanding to build useful orgs, maybe we should prioritise people to do research first. Then the question would be how to shift incentives to move people out of pure research and into org-building later.
Empirically, the most successful - or at least influential - AIS orgs have been built by people with strong research understanding.
My guess is that we're underusing non-researcher/researcher teams, and also that there might be some stuff (e.g. in biosecurity) that people with very little research background can build successfully. There's also probably a tradeoff (2(e)(i)) between great researchers and great builders, and we need more than just research orgs.