A feeling of certainty should usually be regarded as evidence of cognitive bias.

https://www.lesswrong.com/tag/absolute-certainty offers an extreme version of this principle. Absolute certainty requires a standard of evidence so high that we are more likely to have made a mistake than to truly be justified in claiming such certainty.

Less extreme forms of certainty do not necessarily require cognitive bias. For example, I feel certain that the Sun will rise over my house in the morning. If challenged, I will admit that the odds are not 100%, and will give an estimate based mostly on the odds of my house not existing in the morning. Most people will accept that both my feeling and my estimate are rationally justifiable, though maybe with quibbles about how the Sun does not truly rise.

But certainty can be tied to cognitive biases. Cognitive biases make it easy to think some things, and not their opposites. That leads to a feeling of certainty. Now suppose that we commit to the position we're certain of. Evidence against what we believe will cause cognitive dissonance, leading us to reject the evidence. Which means that certainty causes a cognitive bias that reinforces our certainty!

So when we encounter a feeling of certainty in ourselves or others, we should ask whether that certainty is because we have overwhelming evidence or because of a cognitive bias. Bayesian reasoning tells us that the stronger the sense of certainty, and the more complex the chain of evidence leading to the conclusion, the more likely it is that we have encountered cognitive bias. Therefore, a feeling of certainty is evidence of cognitive bias.
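To make the Bayesian point concrete, here is a minimal sketch with made-up numbers (the prior and both likelihoods are my own illustrative assumptions, not figures from the post): if a feeling of certainty is much more likely when bias is at work than when the evidence is genuinely overwhelming, then observing certainty should raise our estimate that bias is present.

```python
# Illustrative Bayes update: how strongly does a feeling of certainty
# suggest cognitive bias? All three input numbers are made-up assumptions.

p_bias = 0.3                     # prior: chance that bias is driving the belief
p_certain_given_bias = 0.9       # bias readily produces a feeling of certainty
p_certain_given_evidence = 0.2   # long evidence chains rarely justify certainty

# Bayes' rule: P(bias | feeling certain)
numerator = p_certain_given_bias * p_bias
denominator = numerator + p_certain_given_evidence * (1 - p_bias)
p_bias_given_certain = numerator / denominator

print(round(p_bias_given_certain, 3))  # ~0.659: certainty raised 0.3 -> 0.66
```

Under these assumptions the feeling of certainty roughly doubles the probability that bias is at work; the qualitative conclusion holds for a wide range of numbers, so long as bias produces certainty more readily than a complex chain of evidence does.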

To be clear, it always feels as though your certainty is based on evidence. Which is why it is useful to realize that you should be extra suspicious of cognitive bias.


That's a nice sounding theory, but can I provide data suggesting that it really happens that way?

First, let's turn to Philip Tetlock's book Expert Political Judgment. Tetlock started with a group of bona fide experts. He asked them questions about their credentials and beliefs, and then asked them to make predictions about the future. Following Isaiah Berlin, he divided them into foxes and hedgehogs, with the hedgehogs having one big idea they were certain of, while the foxes balanced many approaches.

On average, predictions were basically random. But the mistakes were far from equal. The hedgehogs felt certain, so I would predict that they were thinking poorly. And indeed, their predictions were significantly worse than chance. By contrast, the foxes predicted future events significantly better than chance would suggest.

But the story doesn't end there! We would hope that the better thinkers, the foxes, would be rewarded. But it was the opposite. The certain hedgehogs got the bulk of the lucrative pundit positions, such as talk show hosts.

Why? My best guess is that we like outsourcing our thinking. When we encounter someone who is smart, knowledgeable, certain, and says things that we like, it is comfortable to outsource our thinking to them. Surely if we were that smart and had learned that much, we'd agree and decide the same. So why go through that work? We can just repeat what they say and sound smart ourselves!

And so we are biased to accept the thinking of people who are probably not thinking well themselves!


I wish that the story ended there. But in https://rationaldino.substack.com/p/too-important-to-be-able-to-think I give a number of examples of yet another dynamic leading to the same problem. If I really want to be good at X, it is easy for me to convince myself that I am good at X. Once I'm convinced, I will reject all evidence that I might not be good at X. Because I reject it, I stop improving. Because I stop improving, there's a pretty good chance that I'm not nearly as good at X as I think I am.

Sadly, this does not simply happen for random values of X. This dynamic is most likely for the things that are most important. Which means that my certainty of doing well was not just evidence of cognitive bias. Had I taken the warning seriously, it would have been evidence of where my cognitive biases were making it likely that I'd make my most painful life mistakes...

But that's getting far afield of the main point. My main point is to be on the lookout for certainty in yourself and others. When you encounter it, actively look for reasons why that person might have a cognitive bias. Because there is a decent chance that cognitive biases are causing the feeling of certainty.


This is an interesting post. Your linked blog also is very personal and kind and I appreciate that.

My experience is that the things I consider myself good at are things that interest me... They are the things I am actively reading about and for which I look for others to share my interest. Therefore it seems to me that the things I say I am good at are the things that I am actively working towards being good at. Quite the opposite of the last portion of this post.

I think certainty isn't necessarily more fraught with error. In fact, certainty SHOULD be less error prone.

I see your points though. There are some things that are so valuable to us we can't admit we are uncertain. Instead of doing the calculation for how certain we are, we just say, "I NEED to be right here, so I am certain." I agree it's important to recognize this bias, but I don't think certainty is the marker for the bias. I think certainty is 99% great and 1% bad (as in the cases you warn about).

A way to detect the bias you are concerned about would be to try to identify the character traits that are most important to you. Ask what would hurt the most to find out wasn't true. Then evaluate yourself on those traits and recognize that you may be biased there. I don't mean to say this is easy or that I have mastered it, just that it might be more dependable than certainty.

I am not sure if that's the best way to ID the bias, but that's my thoughts.

I agree that when you feel sure of your reasoning, you are generally more likely right than when you aren't sure.

But when you cross into feeling certain, you should suspect cognitive bias. And when you encounter other people who are certain, you should question whether they might also have cognitive bias. Particularly when they are certain on topics that other smart and educated people disagree with them on.

This is not a 100% rule. But I've found it a useful guideline.

If I really want to be good at X, it is easy for me to convince myself that I am good at X.

I boggle. What alien mind is it that thinks this way?

ETA to amplify that: If I attempt to play a musical instrument, I am immediately aware of how well or badly I am playing. If I try to read a foreign language, I can immediately tell how well I understand it. If I try to speak it, it will be evident how well I am being understood. When I lift weights in the gym, I know exactly how much weight I am lifting and how many times I can lift it. How well I invest my money shows up in my bank balance.

In what spheres of activity is it "easy for me to convince myself that I am good at X" if I am not, in fact, good at X?

I am good at making correct moral decisions.
I am good at communicating.
I am good at tolerance, and patience, and humility.
I am good at deciding beneficial national policies and priorities.
I am good at driving.
I am good at my job.
I am good at coming up with lists of examples.

I am good at thinking concretely, as demonstrated by my immediate reaction to these:

I don't know what "good at making correct moral decisions" looks like, let alone "good at deciding beneficial national policies and priorities", which will only be for historians to judge, and they'll disagree among themselves anyway.

To know how good I am at communicating, look at the outcomes, and do not think "I'm good at communicating, he's just stupid!" "Good at tolerance, and patience, and humility" looks like the actual behaviours that these describe, and does not look like blaming the other person for trying one's tolerance, and patience, and humility.

"Good at driving" looks like not being in accidents, not being frequently tooted at for dawdling, being aware of how aware I am being of the various hazards on the road, keeping one's vehicle in good running order, and so on; and does not look like saying "but it was the other driver's fault" after being in an accident. "Good at my job" looks like getting the things done that the job consists of, a progressing career, earned money in the bank, overt recognition by peers, and so on; and does not look like complaining about the injustice of the world if these things are not happening.

And all of these judgements of "good at" involve recognising when one has fallen short, so that one may become better at the thing.

Besides, I doubt I have ever had occasion to think "I am good at..." whatever. (The first sentence of this comment is just rhetorical parallelism.) I would think instead, "this is how good or bad I am at", because there is always someone better, and someone worse.

So, that is how I think about such things.

I don't know what "good at making correct moral decisions" looks like

Maybe you don't know, but at least millions of people, in the judicial systems of nearly every country, claim they do. And these folks publicly announce that they have the power to prosecute and decide your fate the moment you step on their territory, partially or sometimes entirely based on their certitude in their 'correct moral decisions'.

So I think JBlack was pointing out that it seems a bit odd that you could be unaware.

I don't recall seeing such people say so. They are there in various roles to apply the law as best they can. They make various judgements, moral and otherwise, but where do they go about saying that they are good at making these judgements? When a decision must be made, one cannot infer anything about the certitude with which it is made.

When a decision must be made, one cannot infer anything about the certitude with which it is made.

It seems you're a bit confused here. Prosecutors in many countries have great leeway to pick and choose. And even after choosing to prosecute someone, at each step along the way they have nearly complete leeway to pick and choose whether to continue on to the next step until final judgement, or just drop it one day.

There are very rarely cases where they 'must' make a decision on a particular individual, particularly in the US.


I don't know of any countries where the opposite is true, perhaps you know of one?

Prosecutors in many countries have great leeway to pick and choose.

I.e. to make decisions. Everything they do in their jobs involves a decision to do that thing. I am not clear how your reply relates to my comment. And none of this relates to your claim that these people are claiming to be "good at making correct moral decisions".

Perhaps you are unfamiliar with how prosecution systems or the judiciary work in general?

To elaborate, there are several stages along which the 'decisions' of the prosecutors have close to zero impact, if the case does get dropped before final judgement. Even if there is solid evidence of guilt available from the beginning. 

At least in common law countries.

But that also doesn't prevent a prosecutor from going all the way with an actually innocent person based on a belief in being "good at making correct moral decisions" and deciding their fate.

For example, if they have a dozen cases on their desk, half actually innocent, half actually guilty, there simply is no 'must' there. They could decide to drop all of them, drop none, decide based on gut feel, etc.

And the default is to do nothing and let the paperwork sit and collect dust until the next prosecutor comes in, who can also do the same thing, etc., until the case gets too old and is automatically closed.

Only a small fraction makes it into any court at all, and only a small fraction of those ever go all the way through. Sustained partially, or sometimes entirely, based on their certitude in their 'correct moral decisions'. 

One example is the kind of person who began to learn something, worked at it, and became good at it compared to their friends. Without context for what "good" really means in the outside world, it is easy to believe that you are good.

In my blog I gave the example of myself as a teenager in chess. I could usually beat everyone in my school except my brother, so I felt like a good player.

But my competitive rating would have probably been about 1200-1400. I still remember my first encounter with a good chess player. A master was sitting in public, playing simultaneously against everyone who wanted to play him. I sat down, promptly lost, played again and lost again. He gave me some advice beginning with, "Weak players like you should focus on..."

I took offense, despite having just received evidence that he knew what he was talking about when it came to chess.

While I learned better, I've now been on the other side of this interaction in a number of areas. Including ping-pong and programming. Which suggests that my younger self was hardly unique in my overestimation of my abilities.

Indeed, growing up in a small pond and then discovering the wider world can be a shock. The star high school student may discover they are only average at university. But one learns, as you learned about your chess.

You would be amazed at what lengths many go to never learn.

Ever heard the saying (variously attributed) that A level people want to be around other A level people while B level people want to be around C level people?

A lot of those B level people are ones who stop getting better because they believe themselves to already be good. And they would prefer to surround themselves with people who confirm that belief than risk challenging themselves.

Furthermore, it is easier to maintain illusions of superior competency when the activity isn't competitive. It was a lot easier for me to hide from the ways in which I was a bad husband than to hide from the fact that I was losing at chess. There isn't really an objective measure of being a poor husband. And continuing to do what I already did was constant evidence to me that I was a good husband. So my illusions continued until some of the same problems showed up in my next relationship.