Why We Should Always Distrust Our Certainties

by Ted Reynolds · 3 min read · 2nd Oct 2021 · 3 comments


Rationality

While just thinking around, which I often do when nothing else offers, I recently came across a line of thought which presented an insight into a possible cause of much of the misunderstanding which arises within and between human minds.  Since I have not seen this concept mentioned before, and since others may also find it worth considering, I here put it out for what it may be worth.

Both in our communications with others and in our internal monologues with ourselves, we often employ the word or concept of “certainty”.  I here wish to examine the meaning that “certainty” holds for others and for ourselves, as well as the effects it may have on the accuracy of our mental conclusions.  I will come to the conclusion that under some rather clearly distinguishable conditions, the use of the concept “certainty” serves to obscure, rather than to clarify, the objects of thought.  What it conveys at these times is the following:

“I am firmly convinced of the truth of this. I have no doubt about it.  I have examined it in every way and cannot be mistaken. I certainly do not need to examine it again.”

This feeling will reappear instantaneously whenever the concept enters one’s thoughts.  The result is that the belief will not be reexamined in any way.  None of the observations or reasonings which may once, perhaps years ago, have supported the belief will be brought before the mind.  Instead, a very strong feeling of being correct, being justified, not being mistaken will both reinforce the belief and provide a very pleasant feeling that the mind will quite willingly return to on future occasions.  In other words, once the “certainty” feeling is activated, it becomes less and less likely ever to be discarded.  If the belief was ever in any way an error, that error will persist and strengthen.

A less insistent form of what I am pointing out is, of course, already familiar to us all in the form “we tend to remember what attracts us and to forget what we do not like to think about.”  The less dogmatic phrasings “I think that,” “It appears to me,” or “I am of the opinion” escape the dangerous magnetism of “I know that” and “I am certain.”  But I think the present analysis brings home more sharply the strong possibility that precisely those of our beliefs of which we are most certain are the ones most likely to lack adequate authentication.

By the nature of things, these “certain” beliefs will prove to concern the most important matters.  Religious, nationalistic, and romantic beliefs are particularly prone to these fixations.  For this reason, the more people who gain the willpower to habitually investigate (and perhaps modify) their own certainties, the fewer intolerant nations, fanatical religions, and unworkable marriages will encumber us.  I would like to think that I could persuade at least some of you to consider this train of thought in your own minds and to share the results.

I believe that what will now happen in the reader’s mind may involve two incompatible responses: (a) recognizing that what I portray is an accurate account of how our thoughts often work, and at the same time (b) forgetting it as rapidly and completely as possible.  This is unfortunate.  (Although it might help prove my point.)

But I fear most of you may have already forgotten my main point, namely:

Certainty may present itself to us as an already ascertained fact, when it is often a desirable feeling, to which we cling stubbornly for that reason. This contributes to our frequently accepting false beliefs on the most important matters, thinking that we’ve already established them.


3 comments

Since I have not seen this concept mentioned before, and since others may also find it worth considering, I here put it out for what it may be worth.

The related LessWrong concept would be https://www.lesswrong.com/posts/QGkYCwyC7wTDyt3yT/0-and-1-are-not-probabilities 

Both in our communications with others and in our internal monologues with ourselves, we often employ the words or concepts of “certainty”.

Reading this, I imagine that you have a lot of certainty about what goes on in the internal monologues of other people. This style of writing is relatively atypical for LessWrong.

I would like to think that I could persuade at least some of you to consider this train of thought in your own minds and to share the results.

If you want other people to share results, why not share your own?

I'm afraid that your response is quite irrelevant to my own posting.  If you would like to try again, please note that I am in no way assuming anyone else's interior monologues.  I am asking if other people besides myself also sometimes feel the strong feeling of certainty when the original grounds for that certainty are no longer operative.  That would help explain a good deal of human disagreement and misunderstanding.

Please read my post again.   

Gears (hypotheses, paradigms, theorems, narratives) have a lot of internal certainty and robustness in the face of confusion about priors; even the absence of their instances in reality is only a problem of usefulness/relevance, not of validity. When talking about the world, summaries of observations such as likelihood ratios are still more robust than beliefs or deeper theoretical interpretations of observations (for example, bug reports written by non-programmers make less sense when they don't stick to communicating direct observations).

There is no point in accepting beliefs into one's identity; instead, they can be more like cached results of frequently issued queries, occasionally recomputed from the gears and observations rather than meaningfully changed in their own right. It's the changes in the available gears, in understanding, that make this happen; the changes in beliefs are merely symptoms of that process.
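The cache metaphor above can be made concrete with a toy sketch (all names here are hypothetical illustrations, not from the comment): the belief is never edited directly; it is a memoized result that gets invalidated and recomputed whenever the underlying evidence changes.

```python
class BeliefCache:
    """Toy model of a belief as a cached query over evidence,
    rather than a fixed conviction defended in its own right."""

    def __init__(self, evidence):
        self.evidence = list(evidence)  # the "gears" and observations
        self._cached = None             # memoized conclusion
        self._stale = True              # must recompute before first use

    def add_observation(self, obs):
        self.evidence.append(obs)
        self._stale = True              # new evidence invalidates the cache

    def belief(self):
        # Recompute from evidence only when something changed;
        # here the "query" is just an average, standing in for
        # whatever reasoning originally justified the belief.
        if self._stale:
            self._cached = sum(self.evidence) / len(self.evidence)
            self._stale = False
        return self._cached


b = BeliefCache([0.9, 0.8])
first = b.belief()       # conclusion derived from current evidence
b.add_observation(0.1)   # a disconfirming observation invalidates the cache
second = b.belief()      # the belief updates instead of persisting unchanged
```

The contrast with the "certainty" failure mode described in the post is the `_stale` flag: a stubbornly certain mind is one whose cache is never invalidated, so the original grounds are never brought before it again.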