Someone is well-calibrated if the things they predict with X% chance of happening in fact occur X% of the time. Importantly, calibration is not the same as accuracy. Calibration is about accurately assessing how good your predictions are, not about making good predictions. Person A, whose predictions are marginally better than chance (60% of them come true when choosing from two options) and who is precisely 60% confident in their choices, is perfectly calibrated. In contrast, Person B, who is 99% confident in their predictions and right 90% of the time, is more accurate than Person A but less well-calibrated.
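The comparison above can be sketched numerically. The function below is a minimal illustration (with hypothetical data, not a standard library API): it groups a person's predictions at a single stated confidence level and compares that confidence with the observed frequency of correct predictions.

```python
def calibration_gap(confidence, outcomes):
    """Return stated confidence minus observed hit rate.

    A well-calibrated predictor has a gap near zero; a positive gap
    means overconfidence, a negative gap means underconfidence.

    confidence: probability the predictor assigned to each prediction
    outcomes:   1 if the prediction came true, else 0
    """
    hit_rate = sum(outcomes) / len(outcomes)
    return confidence - hit_rate

# Person A: 60% confident, right 60 times out of 100 -> gap 0 (well-calibrated)
person_a = [1] * 60 + [0] * 40
print(calibration_gap(0.60, person_a))  # 0.0

# Person B: 99% confident, right 90 times out of 100 -> overconfident
person_b = [1] * 90 + [0] * 10
print(round(calibration_gap(0.99, person_b), 2))  # 0.09
```

In practice one would bin predictions by confidence level and compute this gap per bin (a calibration curve), but the single-bin case already captures the Person A / Person B contrast.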
Being well-calibrated has value for rationalists separately from accuracy. Among other things, being well-calibrated lets you make good bets and decisions, communicate information helpfully to others if they know you to be well-calibrated, and helps you prioritize.
Note that all expressions of quantified confidence can be well- or poorly calibrated. For example, calibration applies to whether a person's 95% confidence intervals capture the true outcome 95% of the time.
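The interval case can be sketched the same way. The snippet below (a simulation with made-up data, not a real forecaster's record) checks whether a set of 95% intervals actually contains the true value about 95% of the time.

```python
import random

random.seed(0)

def coverage(intervals, truths):
    """Fraction of (lo, hi) intervals that contain the corresponding true value."""
    hits = sum(lo <= t <= hi for (lo, hi), t in zip(intervals, truths))
    return hits / len(truths)

# Simulate a forecaster whose stated 95% intervals really do have ~95% coverage:
# each interval captures the true value with probability 0.95.
truths = [random.random() for _ in range(10_000)]
intervals = []
for t in truths:
    if random.random() < 0.95:
        intervals.append((t - 0.1, t + 0.1))   # interval that captures t
    else:
        intervals.append((t + 0.2, t + 0.4))   # interval that misses t

print(coverage(intervals, truths))  # close to 0.95
```

A calibrated forecaster's coverage matches the stated level; a coverage well below 0.95 would indicate overconfident (too-narrow) intervals.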
List of Calibration Exercises
Exercises that are dead/unmaintained