We could also say that if we took a probability distribution over the chances of every possible set of findings being true, the differential entropy of that distribution would be 0, as smart forecasters would recognize that inputs_i is correct with ~100% probability.
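To make the entropy claim concrete: reading the entropy here as being over the (discrete) set of possible findings, here's a minimal sketch (Python; the five-outcome distribution and the confidence levels are invented for illustration) showing that the entropy approaches 0 as forecasters converge on one set of findings being ~100% likely:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution, skipping zero terms."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical distribution over 5 possible "sets of findings".
# As forecasters converge on findings_1 being correct, entropy -> 0.
for confidence in [0.5, 0.9, 0.99, 0.999]:
    rest = (1 - confidence) / 4          # spread the remainder uniformly
    dist = [confidence] + [rest] * 4
    print(f"P(findings_1) = {confidence}: H = {entropy(dist):.4f} bits")
```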

In that paragraph, did you mean to say "findings_i is correct"?

***

Neat idea. I'm also not sure whether the idea's value comes from it being something we could actually implement, or from "this is interesting because it gets us better models."

In the first case, I'm not sure the correlation is strong enough to change any decisions. That is, I'm having trouble thinking of decisions for which I need to know the generalizability of something, and where my best option is to measure its predictability.

For example, in small Foretold/Metaculus communities, I'd imagine that miscellaneous factors like "is this question interesting enough to attract the top 10% of forecasters" will just make the path predictability -> differential entropy -> generalizability difficult to detect.
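To illustrate that worry, here's a toy Monte Carlo sketch (Python; every distribution and effect size is invented for illustration, not taken from any real community data) in which a strong miscellaneous "question interestingness" factor drowns out the generalizability signal in measured predictability:

```python
import numpy as np

rng = np.random.default_rng(0)
n_questions = 200

# Hypothetical latent generalizability of each claim.
generalizability = rng.normal(0, 1, n_questions)

# Predictability tracks generalizability, plus noise from factors like
# "did this question attract the top 10% of forecasters?".
interest_noise = rng.normal(0, 2.0, n_questions)   # strong miscellaneous factor
predictability = generalizability + interest_noise

r = np.corrcoef(generalizability, predictability)[0, 1]
print(f"correlation: {r:.2f}")  # with noise this large, the signal is weak
```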
