Belief in Belief vs. Internalization

by Desrtopa · 1 min read · 29th Nov 2010 · 59 comments


Self-Deception · Motivated Reasoning · Anticipated Experiences · Rationality

Related to Belief In Belief

Suppose that a neighbor comes to you one day and tells you “There’s a dragon in my garage!” Since all of us have been through this before at some point or another, you may be inclined to save time and ask “Is the dragon by any chance invisible, inaudible, intangible, and does it convert oxygen to carbon dioxide when it breathes?”

The neighbor, however, is a scientifically minded fellow and responds “Yes, yes, no, and maybe, I haven’t checked. This is an idea with testable consequences. If I try to touch the dragon it gets out of the way, but it leaves footprints in flour when I sprinkle it on the garage floor, and whenever it gets hungry, it comes out of my garage and eats a nearby animal. It always chooses something weighing over thirty pounds, and you can see the animals get snatched up and mangled to a pulp in its invisible jaws. It’s actually pretty horrible. You may have noticed that there have been fewer dogs around the neighborhood lately.”

This trips a tremendous number of your skepticism filters, and so the only thing you can think of to say is “I think I’m going to need to see this.”

“Of course,” replies the neighbor, and he sets off across the street, opens the garage door, and is promptly eaten by the invisible dragon.

Tragic though it is, his death provides a useful lesson. He clearly believed that there was an invisible dragon in his garage, and he was willing to stick his neck out and make predictions based on it. However, he hadn’t internalized the idea that there was a dragon in his garage; otherwise he would have stayed the hell away to avoid being eaten. Humans have a fairly general weakness when it comes to internalizing beliefs whose immediate consequences we don’t come face to face with on a regular basis.

You might believe, for example, that starvation is the single greatest burden on humanity, and that giving money to charities that aid starving children in underdeveloped countries has higher utility than any other use of your surplus funds. You might even be able to make predictions based on that belief. But if you see a shirt you really like that’s on sale, you’re almost certainly not going to think “How many people whom I could have fed will go hungry if I buy this?” It’s not a weakness of willpower that causes you to choose the shirt over the starving children; they simply don’t impinge on your consciousness at that level.

When you consider if you really, properly hold a belief, it’s worth asking not only how it controls your anticipations, but whether your actions make sense in light of a gut-level acceptance of its truth. Do you merely expect to see footprints in flour, or do you move out of the house to avoid being eaten?