Piggyback question on this: why aren't LessWrongers finding and exploiting cognitive biases in markets in order to raise funds for their projects?

I realize that (a) it's really hard to do this or everyone would do it; and (b) there probably are individual LessWrongers working in finance. But to the extent that LW tends to think that entire fields of experts can be blind in their disciplines in ways disciplined rationalists are not (theologians, philosophers, doctors, politicians, educators, physicists), there would seem to be the prospect of some massively profitable arbitrage or prediction somewhere. And it's not like any of LessWrong's projects are allergic to funding.

My theory is that people who believe they can beat the experts across a variety of fields initially try to beat them at testable matters, the natural choice for someone wanting to demonstrate superiority or attract funding. At that point, one of three things can happen: (a) success that others recognize; (b) recalibration of their self-assessment; or (c) maintenance of the belief by shifting to non-testable subject matters (those without strong feedback).

wedrifid · 7y: Large, well-funded markets are smarter than LessWrongers. Experts whose incentives reward epistemic accuracy, and who get significant direct feedback from the universe, can usually be assumed to be reliable. All else being equal, this would lead us to trust index funds, be wary of managed funds, and be sceptical of paid financial advice.

More "Stupid" Questions

by NancyLebovitz · 1 min read · 31st Jul 2013 · 498 comments

This is a thread where people can ask questions that they would ordinarily feel embarrassed for not knowing the answer to. The previous "stupid" questions thread went to over 800 comments in two and a half weeks, so I think it's time for a new one.