Calibration

Someone is well-calibrated if the things they predict with X% chance of happening in fact occur X% of the time. Importantly, calibration is not the same as accuracy. Calibration is about accurately assessing how good your predictions are, not about making good predictions. Person A, whose predictions are marginally better than chance (60% of them come true when choosing between two options) and who is precisely 60% confident in their choices, is perfectly calibrated. In contrast, Person B, who is 99% confident in their predictions but right only 90% of the time, is more accurate than Person A, yet less well-calibrated.
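
As a rough sketch of how this can be checked in practice (the function and sample record below are hypothetical, not from any standard tool), one way to measure your own calibration is to group past predictions by stated confidence and compare each group's stated probability to the fraction that actually came true:

```python
from collections import defaultdict

def calibration_table(predictions):
    """Group (stated probability, happened?) pairs by confidence level
    and report how often the events in each group actually occurred."""
    groups = defaultdict(list)
    for prob, happened in predictions:
        groups[round(prob, 1)].append(happened)  # bin to the nearest 10%
    return {
        stated: {"n": len(outcomes),
                 "observed": sum(outcomes) / len(outcomes)}
        for stated, outcomes in sorted(groups.items())
    }

# Hypothetical record of past predictions: (stated probability, did it happen?)
record = [(0.7, True), (0.7, True), (0.7, False), (0.7, True),
          (0.9, True), (0.9, True), (0.6, False), (0.6, True),
          (0.8, True), (0.8, True), (0.8, False)]

for stated, row in calibration_table(record).items():
    print(f"said {stated:.0%} -> happened {row['observed']:.0%} of the time (n={row['n']})")
```

A well-calibrated predictor's observed frequencies track the stated probabilities; systematic gaps (e.g. "90%" claims coming true only 70% of the time) indicate overconfidence.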

Knowing how good your predictions are is a key rationalist skill, and being well-calibrated has value separately from accuracy. Among other things, being well-calibrated lets you make good bets and good decisions, lets you communicate information helpfully to others who know you to be well-calibrated (see Group Rationality), and helps you prioritize which information is worth acquiring.
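
As a hedged illustration of the betting point, with made-up numbers: a calibrated credence tells you exactly which odds are worth taking, while an overconfident one steers you into losing bets.

```python
def expected_value(p_win, stake, payout_if_win):
    """Expected profit of a bet, assuming p_win is an honestly calibrated probability."""
    return p_win * payout_if_win - (1 - p_win) * stake

# A calibrated 70% credence says risking $10 to win $5 is worth it on average...
print(expected_value(0.70, stake=10, payout_if_win=5))   # approx +0.50
# ...while the same bet loses money on average if the true hit rate is only 60%.
print(expected_value(0.60, stake=10, payout_if_win=5))   # -1.00
```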

Note that all expressions of quantified confidence in beliefs and predictions can be well- or poorly calibrated. For example, calibration applies to whether a person's 95% confidence intervals capture the true outcome 95% of the time, or whether their 80%-chance-of-completion estimates are met 80% of the time (not much more or less). Trivially, odds placed on things are convertible to probabilities, so the same standard applies to them.
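
For instance (a minimal sketch with hypothetical numbers), odds of a:b in favor of an event correspond to a probability of a/(a+b), so a claim of "3:1 odds" is calibrated exactly when the corresponding 75% predictions come true about 75% of the time:

```python
def odds_to_probability(a, b):
    """Convert odds of a:b in favor of an event into a probability."""
    return a / (a + b)

print(odds_to_probability(3, 1))  # 3:1 odds -> 0.75
print(odds_to_probability(1, 4))  # 1:4 odds -> 0.2
```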

See also: Betting, Epistemic Modesty, Forecasting & Prediction
