There's a recent paper (PDF) which finds that people who know little about a problem are more inclined not to find out more about it. Moreover, the larger in scale and the more complex a problem appeared, the more likely people were to avoid learning about it, and the more likely they were to trust that existing institutions such as the government could handle it. This looks like a potentially interesting form of cognitive bias, and it may also help explain why people are so unwilling to look at existential risk: essentially no issue occurs on a larger scale than existential risk. This suggests that in trying to get people to understand existential risk, it may make sense to start with the easier-to-understand existential risks, such as large asteroids.