There's a recent paper (PDF) which finds that the less people know about a problem, the more inclined they are to avoid finding out more about it. Moreover, the larger in scale and more complex a problem appeared, the more likely people were to avoid learning more about it, and the more likely they were to trust that pre-existing institutions such as the government could handle it. This looks like a potentially interesting form of cognitive bias. It may also explain why people are so unwilling to look at existential risk: essentially no issue occurs on a larger scale than existential risk. This suggests that in trying to get people to understand existential risk, it may make sense to start with the easier-to-understand risks, such as large asteroids.

[anonymous]:

The authors seem to be missing the elephant in the room: global warming is not just any problem, it's a problem that comes loaded with mind-killing intellectual baggage. Even the very first paragraph, in which the authors quote an enraged parent, indicates that political affiliation and signalling are playing a large role in determining people's reactions toward risks. The experiments they perform are likewise tainted with mind-killing topics, particularly the one about peak oil. It's not clear that the subjects' responses are the result of the "need to cope" model that the authors proposed rather than the result of Blue vs. Green thinking.

If the model put forth in the paper is true, though, it is an important insight into ugh fields.

I wonder if this is less a direct cognitive bias and more a fully sensible approach.

I mean, let's face it: How many of us really have any meaningful control whatsoever over any Existential Risks? Even with man-made ones like global warming, any one individual's contribution is such a tiny drop in a colossal bucket that it hardly makes a dent. It would take a massive concerted effort to address many of these issues, and most people are already over-committed in their lives -- they're not looking for another cause to gobble up their scant time and money.

So sure, if you already know a lot about a topic because you found it interesting, then take steps to feel like you're doing something: vote for candidates who seem to be tackling the problem, opine on message boards, and whatnot. But for a person who doesn't know anything about the topic (and I'm willing to make the leap and say they probably don't find it interesting, or else they would have learned about it already), doesn't it make sense to say, "I'll let the specialists handle this one"? And who do the "specialists" work for, when it comes to things like the environment, disease control, asteroid protection, etc.? More often than not, the government.

What's the alternative? That each one of us runs off and studies every existential risk in our spare time, knowing full well that our individual involvement will have approximately 0.0% impact on how those threats affect our lives? That, if anything, sounds like a cognitively flawed approach!

> How many of us really have any meaningful control whatsoever over any Existential Risks?

Anyone who has donated to SIAI, plus others.

> Oh, sure, everyone thinks two plus two is four, everyone says two plus two is four, and in the mere mundane drudgery of everyday life everyone behaves as if two plus two is four, but what does two plus two really, ultimately equal? As near as I can figure, four. It's still four even if I intone the question in a solemn, portentous tone of voice.

-- The Simple Truth