
Someone is probability- or credence-calibrated if the things they predict with 70% confidence in fact occur 70% of the time (and likewise at other confidence levels). Importantly, calibration is not the same as accuracy: calibration is about accurately assessing how good your predictions are, not about making good predictions. Person A, whose predictions are only marginally better than chance, e.g. 60% of them come true, and who knows this, is well-calibrated. In contrast, Person B, whose predictions are 90% accurate yet who believes they are 99% accurate, is more accurate than Person A while being less well calibrated.
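As a concrete illustration, one way to check calibration is to group predictions by their stated confidence and compare each group's confidence with the fraction that actually came true. The Python sketch below does this for hypothetical data; the function name and bucketing scheme are illustrative, not a standard tool.

```python
from collections import defaultdict

def calibration_table(predictions):
    """predictions: a list of (stated_probability, happened) pairs."""
    buckets = defaultdict(list)
    for p, happened in predictions:
        # Group predictions by stated confidence, rounded to the nearest 10%.
        buckets[round(p, 1)].append(happened)
    for stated in sorted(buckets):
        outcomes = buckets[stated]
        observed = sum(outcomes) / len(outcomes)
        print(f"stated ~{stated:.0%}: {observed:.0%} came true (n={len(outcomes)})")

# Hypothetical data: ten "70%" predictions, of which seven came true: well calibrated.
calibration_table([(0.7, True)] * 7 + [(0.7, False)] * 3)
```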

Knowing how good your predictions are is a key rationalist skill. Among other things, being calibrated lets you make good bets and good decisions, communicate information usefully to others who know you to be well-calibrated (see Group Rationality), and prioritize which information is worth acquiring.

Note that calibration applies to all expressions of quantified confidence in beliefs/predictions (see Anticipate Experiences). For example, it applies to whether a person's 95% confidence intervals capture the true value 95% of the time, or whether tasks they estimate an 80% chance of completing are in fact completed 80% of the time (not much more or less). Trivially, the odds placed on things are convertible to probabilities...
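As a rough illustration of that conversion, odds of a:b in favor of an event correspond to a probability of a / (a + b); the snippet below works through two hypothetical examples.

```python
# Odds of a:b in favor of an event correspond to a probability of a / (a + b).
def odds_to_probability(a, b):
    return a / (a + b)

print(odds_to_probability(3, 1))  # 3:1 odds in favor -> 0.75
print(odds_to_probability(1, 4))  # 1:4 odds in favor -> 0.2
```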
