Has anyone tried to make a sort of standardized test for collaborative rationality/coordination skills (double-crux, ITT, things like in this post)? It seems to me like that, plus a sort of badge system on a browser extension[1], would solve problems where both people would coordinate and use a better strategy if each knew the other person was able to.
[1] A browser extension allows for use on more than just this site; also, I don't know how I would feel about this being an Official LW Thing.
Law of the Non-Player-Character:
If you have an interesting idea or project that you probably won't do, write it somewhere! It still might not happen, but it's better than if you kept it a secret.
LessWrong appears to sit at the Pareto-frontier of truthseeking, calibration, and clear writing about facts we've discovered. This is a mix of investigative-journalist and quant traits. If you attach a hedge fund to an investigative journal, it becomes profitable to write based on how accurate the information is relative to the current consensus. We seem fairly funding-constrained as a community, and this seems like an opportunity to profitably raise the sanity waterline. AFAIK rationalists haven't done this; why not? FTX?
That journal investigates things in ways that produce immediate changes in the prices of easily tradeable assets. LessWrong doesn't.
It often does investigate things in such a way that will produce or predict eventual changes in an ambiguous basket of assets over a long timespan, but it's hard to make much money that way; you can't compound your returns.
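The compounding point can be made concrete with made-up numbers (these are illustrative, not a claim about actual trading returns): many quick bets reinvested multiply together, while one slow bet with the same total edge does not.

```python
# Illustrative sketch with hypothetical numbers: ten fast trades at +10% each,
# reinvested, versus one slow bet paying the same total edge (+100%) over the
# same period with no chance to reinvest.

def compound(per_trade_return, n_trades):
    """Grow capital by reinvesting after each trade."""
    capital = 1.0
    for _ in range(n_trades):
        capital *= 1 + per_trade_return
    return capital

fast = compound(0.10, 10)  # ten compounded 10% wins: 1.1**10 ~= 2.59x
slow = 1.0 + 1.0           # one 100% win, no reinvestment: 2.00x
print(f"compounded: {fast:.2f}x, single long bet: {slow:.2f}x")
```

The gap widens fast as the number of tradeable events grows, which is why immediate price reactions matter so much to the hedge-fund model.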
Hmm, I think I misphrased my OP; I agree that a lot of what we write about wouldn't produce those sorts of immediate changes. I meant more that this is a rather profitable business model that I expect some of us to be very well-suited to, and which also happens to work towards one of the core goals of our community. I also separately expect that we spend far less time than we should finding ways to bet on the seemingly ambiguous things, but I don't think that needs to work out for an LW investigative journal to be profitable. It seems plausible that the people with the necessary skillsets are currently busy with more valuable things, but I'm a little suspicious of that argument because of the aforementioned lack of funding.
Why would LessWrong control this, though? I think it would look more like some LessWrong users being part of an investing firm and donating some of their winnings to Lightcone, which is possibly already happening.
Pareto-frontier
You mean the Shapley value (edit: unsure, it's something in that genre; it might not have had a name, it might have been something like "wherever the diagonal intersects the convex Pareto front"). The Pareto frontier is the entire tradeoff space, which encompasses the Shapley value, and it also has many points where, e.g., clear writing is high but truthseeking and calibration are zero.
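The distinction can be sketched in a few lines: a Pareto frontier keeps every non-dominated point, including lopsided corner points, rather than picking out one balanced optimum. The three axes and all the scores below are made up for illustration.

```python
# Minimal Pareto-dominance sketch over three hypothetical axes:
# (truthseeking, calibration, clear writing).

def dominates(a, b):
    """a dominates b if a >= b on every axis and > b on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

points = [
    (9, 8, 7),   # strong on all three axes
    (10, 0, 0),  # pure truthseeking, nothing else
    (0, 0, 10),  # clear writing only: truthseeking and calibration are zero
    (5, 5, 5),   # dominated by (9, 8, 7), so not on the frontier
]
print(pareto_front(points))  # -> [(9, 8, 7), (10, 0, 0), (0, 0, 10)]
```

Note that the lopsided points survive: being on the frontier only means no one point beats you on every axis at once, which is why "Pareto frontier" is a weaker claim than naming a single best tradeoff.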