Title: [SEQ RERUN] Hindsight Bias

Tags: sequence_reruns

Today's post, Hindsight Bias, was originally published on 16 August 2007. A summary (taken from the LW wiki):

Hindsight bias makes us overestimate how well our model could have predicted a known outcome. We underestimate the cost of avoiding a known bad outcome, because we forget that many other equally severe outcomes seemed as probable at the time. Hindsight bias distorts the testing of our models by observation, making us think that our models are better than they really are.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was One Argument Against an Army, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.


One thing I've been wondering about in regards to hindsight bias is whether it is connected to the more extreme phenomenon in young children, where they erroneously report their own prior beliefs. See for example Gopnik & Astington's 1988 paper, in which children around four years old are shown a box that normally contains some object (say, a pencil box) and are then shown what was actually in the box (some other thing, such as candy). A large fraction of children in a certain age range, when asked what they originally thought was in the box, will say "candy", even though they clearly acted surprised when it was opened. Moreover, in similar experiments, children who fail at the task are much more likely, when asked what other children will think is in the box, to answer "candy" rather than "pencils". So it almost seems like hindsight bias is a less extreme example of the same phenomenon, which amounts to a combination of the illusion of transparency and a failure to have an accurate theory of mind. I don't know the hindsight bias literature well enough to know whether this point has been suggested before, but I suspect it has.

Some scholars apparently share this intuition. Googling "'hindsight bias' gopnik" turns up a number of related papers.