Today's post, Knowing About Biases Can Hurt People, was originally published on April 4, 2007. A summary (from the LW wiki):

Learning common biases won't help you obtain truth if you only use this knowledge to attack beliefs you don't like. Discussions about biases need to first do no harm by emphasizing motivated cognition, the sophistication effect, and dysrationalia, although even knowledge of these can backfire.

Discuss the post here (rather than in the comments of the original post).

This post is part of a series rerunning Eliezer Yudkowsky's old posts so those interested can (re-)read and discuss them. The previous post was The Majority is Always Wrong, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it, posting the next day's sequence reruns post, summarizing forthcoming articles on the wiki, or creating exercises. Go here for more details, or to discuss the Sequence Reruns.


I don't understand what makes learning about cognitive biases intrinsically different from obtaining any other type of knowledge. That is, couldn't you make a parallel argument that learning math (or any rationality skill) is dangerous unless it is applied evenhandedly to your own beliefs and to the beliefs of others?

The problem is that it's much easier to apply knowledge about biases to dismiss people who disagree with you as biased than to apply knowledge about math to... I'm not even sure what the analogous thing you're thinking of is.

What's the evidence that knowing about cognitive biases is more dangerous than knowing math? My claim is that it is just as easy to apply math in an unbalanced way that favors one's already-held beliefs as it is to apply cognitive biases in a similarly unbalanced way.

In other words, why did EY speak specifically to cognitive biases, as opposed to the general problem of using your knowledge more vigorously to attack others' arguments than to attack your own?

I dispute your claim. It doesn't seem, to me, that it would be anywhere near as easy to translate an understanding of maths into a Fully General Counterargument, as it would be to do so with an understanding of cognitive biases. If someone disagrees with me, I can readily call to mind a number of cognitive biases of which I could accuse my opponent, which would, at least at the surface level, appear relevant. This would with high likelihood undermine his position in the eyes of (human!) observers even if my accusations are not true.

On the other hand, I am struggling to imagine how I could do the same with my understanding of mathematics. That doesn't mean it's not possible, but it certainly seems a lot more difficult.