As used here, "bias" or "cognitive bias" refers to a specific, predictable error pattern in the human mind. The heuristics and biases program in cognitive psychology has documented hundreds of reproducible errors, often strikingly large ones. From these errors we can in turn infer the specific reasoning mechanisms that give rise to them: the "heuristics". This continues to be a highly active area of investigation in cognitive psychology.
Understanding cognitive biases and trying to defend against their effects has been a basic theme of Less Wrong since the days it was part of Overcoming Bias.
===Wiki category and blog tag===
Starting points:
Blog posts on the concept of "bias":
Blog posts about known cognitive biases:
- Scope Insensitivity — The human brain can't represent large quantities: an environmental measure that will save 200,000 birds doesn't conjure anywhere near a hundred times the emotional impact and willingness-to-pay of a measure that would save 2,000 birds.
- Correspondence Bias, also known as the fundamental attribution error, refers to the tendency to attribute the behavior of others to intrinsic dispositions, while excusing one's own behavior as the result of circumstance.
- Confirmation Bias, or Positive Bias, is the tendency to look for evidence that confirms a hypothesis, rather than for evidence that would disconfirm it.
- Hindsight Bias describes the tendency for outcomes, once known, to seem much more predictable in hindsight than they could have been predicted to be beforehand.
- Planning Fallacy — We tend to plan by envisioning that everything will go as expected. Even assuming that such an estimate is accurate conditional on everything going as expected, things will not go as expected. As a result, we routinely see outcomes worse than the ex ante worst-case scenario.
- Conjunction Fallacy — Elementary probability theory tells us that the probability of one thing (written P(A)) is necessarily greater than or equal to the probability of the conjunction of that thing with another thing (written P(A&B)). In the psychology lab, however, subjects' judgments do not conform to this rule, and not merely as an isolated artifact of a particular study design. Debiasing won't be as simple as practicing specific questions; it requires certain general habits of thought. (A short derivation and worked example appear after this list.)
- We Change Our Minds Less Often Than We Think — we all change our minds occasionally, but we don't constantly, honestly reevaluate every decision and course of action. Once you think you believe something, the chances are good that you already do, for better or worse.
- Priming and Contamination — Even slight exposure to a stimulus is enough to change the outcome of a decision or estimate. See also Never Leave Your Room by Yvain, and Cached Selves by Salamon and Rayhawk.
- Do We Believe Everything We're Told? — Some experiments on priming suggest that mere exposure to a view is enough to get one to passively accept it, at least until it is specifically rejected.
- Illusion of Transparency — Everyone knows what their own words mean, but experiments have confirmed that we systematically overestimate how much sense we are making to others.
- Self-Anchoring — Related to contamination and the illusion of transparency, we "anchor" on our own experience and under-adjust when trying to understand others.
- Affect Heuristic — Positive and negative emotional impressions exert a greater effect on many decisions than does rational analysis.
- Evaluability — It's difficult for humans to evaluate an option except in comparison to other options. Poor decisions result when a poor category for comparison is used. Includes an application for cheap gift-shopping.
- Unbounded Scales, Huge Jury Awards, and Futurism — Without a metric for comparison, estimates of, e.g., what punitive damages should be awarded, or when some future advance will happen, vary widely simply for lack of a scale.
- The Halo Effect — Positive qualities seem to correlate with each other, whether or not they actually do.
- Asch's Conformity Experiment — The unanimous agreement of surrounding others can make subjects disbelieve (or at least, fail to report) what's right before their eyes. The addition of just one dissenter is enough to dramatically reduce the rates of improper conformity.
- The Allais Paradox (and subsequent followups) — Offered choices between gambles, people make decision-theoretically inconsistent choices; a worked version of the gambles appears just below.
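To make the Allais inconsistency concrete, here is one standard version of the gambles; the exact payoffs vary across presentations of the paradox, and these particular numbers are only illustrative:

    Gamble 1A: $1,000,000 with certainty.
    Gamble 1B: 10% chance of $5,000,000, 89% chance of $1,000,000, 1% chance of nothing.
    Gamble 2A: 11% chance of $1,000,000, 89% chance of nothing.
    Gamble 2B: 10% chance of $5,000,000, 90% chance of nothing.

Most subjects prefer 1A over 1B, but also 2B over 2A. For any utility function U, normalizing U($0) = 0, expected utility theory requires:

    1A > 1B  implies  U($1M) > 0.10·U($5M) + 0.89·U($1M),  i.e.  0.11·U($1M) > 0.10·U($5M)
    2B > 2A  implies  0.10·U($5M) > 0.11·U($1M)

No assignment of utilities satisfies both inequalities, so the common pattern of preferences cannot be represented by any expected utility maximizer.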
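For the Conjunction Fallacy entry above, the rule in question follows in one line from the product rule of probability theory:

    P(A&B) = P(A) × P(B|A) ≤ P(A),  since P(B|A) ≤ 1.

As a worked example with hypothetical numbers, chosen only to show the direction of the inequality: in Tversky and Kahneman's famous "Linda" experiment, if P(Linda is a bank teller) = 0.05 and P(active in the feminist movement | bank teller) = 0.2, then P(bank teller & feminist) = 0.05 × 0.2 = 0.01, five times less probable than "bank teller" alone. Yet subjects reliably ranked the conjunction as the more probable statement.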
Not to be confused with: