Related to: Knowing About Biases Can Hurt People

HT: Marginal Revolution

Paper.

Social psychologists have identified various plausible sources of ideological polarization over climate change, gun violence, national security, and like societal risks. This paper reports a study of three of them: the predominance of heuristic-driven information processing by members of the public; ideologically motivated cognition; and personality-trait correlates of political conservatism. The results of the study suggest reason to doubt two common surmises about how these dynamics interact. First, the study presents both observational and experimental data inconsistent with the hypothesis that political conservatism is distinctively associated with closed-mindedness: conservatives did no better or worse than liberals on an objective measure of cognitive reflection; and more importantly, both demonstrated the same unconscious tendency to fit assessments of empirical evidence to their ideological predispositions. Second, the study suggests that this form of bias is not a consequence of overreliance on heuristic or intuitive forms of reasoning; on the contrary, subjects who scored highest in cognitive reflection were the most likely to display ideologically motivated cognition. These findings corroborated the hypotheses of a third theory, which identifies motivated cognition as a form of information processing that rationally promotes individuals’ interests in forming and maintaining beliefs that signify their loyalty to important affinity groups. The paper discusses the normative significance of these findings, including the need to develop science communication strategies that shield policy-relevant facts from the influences that turn them into divisive symbols of identity.


Here's the design of the study.

A large and diverse sample of Americans was surveyed by a national polling firm, Polimetrix/YouGov. The firm collected party affiliation and other demographic information, and gave each subject the CRT (the Cognitive Reflection Test, a short set of word problems whose intuitive answers are wrong). After completing the CRT, one group of subjects was told that it was a measure of how reflective and open-minded someone is, and that a recent study had found that people who believe in climate change tend to score better than those who doubt climate change. They were then asked whether they thought it was a valid measure of how reflective and open-minded someone is. Another group was given the opposite information (climate change doubters scored better), and there was a control condition.

Results: Democrats were more likely to agree that the test was valid if they were told that climate change believers scored better (vs. if climate change doubters scored better), and Republicans showed the opposite pattern. That was the study's measure of motivated reasoning. Further, 1) the magnitude of the gap was the same for Democrats as for Republicans (they were pretty much perfect mirror images of each other), and 2) the gap was bigger for people who scored higher on the CRT than for those who scored lower (and this pattern held for both Democrats and Republicans). Look at the graph here.

Additional Results: 1) As the graph shows, among low-CRT people (Republican or Democrat), hearing that climate change believers did better had a large impact on opinions about the test's validity (relative to the control condition), but hearing that climate change doubters did better had almost no impact; among high-CRT people, both pieces of information had similarly large impacts on opinions about validity. 2) There was also a slight trend for Republicans to score higher than Democrats on the CRT; this correlation was much smaller than the correlations between the CRT and other demographic variables.
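To make the analysis concrete, here is a minimal sketch in Python/pandas of how the gap measure and its CRT interaction could be computed. The DataFrame `df`, its column names, and the median split on CRT are all invented for illustration; the paper's actual statistical models may differ.

```python
import pandas as pd

# Hypothetical per-respondent data; all column names are invented:
#   party     -- "Democrat" or "Republican"
#   condition -- "believers_scored_better", "doubters_scored_better", "control"
#   crt       -- Cognitive Reflection Test score (0-3)
#   validity  -- agreement that the test is a valid measure (e.g., 1-6)

def motivated_reasoning_gap(df: pd.DataFrame) -> pd.Series:
    """Per party: mean validity rating when told believers scored better,
    minus the mean when told doubters scored better. A positive gap for
    Democrats alongside a negative gap for Republicans is the
    motivated-reasoning pattern described above."""
    means = df.groupby(["party", "condition"])["validity"].mean().unstack()
    return means["believers_scored_better"] - means["doubters_scored_better"]

def gap_by_reflection(df: pd.DataFrame) -> pd.DataFrame:
    """The same gap, split by a median split on CRT score. The paper's
    finding corresponds to a *larger* absolute gap in the high-CRT half."""
    levels = (df["crt"] >= df["crt"].median()).map(
        {True: "high_crt", False: "low_crt"})
    means = (df.assign(crt_level=levels)
               .groupby(["crt_level", "party", "condition"])["validity"]
               .mean()
               .unstack("condition"))
    gap = means["believers_scored_better"] - means["doubters_scored_better"]
    return gap.unstack("party")
```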

Here is the exact text of (one version of) the experimental manipulation:

Psychologists believe the questions you have just answered measure how reflective and open-minded someone is. In one recent study, a researcher found that people who accept evidence of climate change tend to get more answers correct than those who reject evidence of climate change. If the test is a valid way to measure open-mindedness, that finding would imply that those who believe climate change is happening are more open-minded than those who are skeptical that climate change is happening.

How strongly do you agree or disagree with this statement?
I think the word-problem test I just took supplies good evidence of how reflective and open-minded someone is.

In other words, smart, reflective people are better at using those smarts and reflections to play monkey political games, maybe one meta-level up.

Of course, playing politics well is important to effectiveness in real life. Learning about rationality might make you a worse rationalist, but it probably helps you win at life, including if your goal is to, say, promote a movement that is positively correlated with rationality.

You seem to be contradicting yourself. If learning about rationality probably helps you win at life, that means it makes you a better rationalist.

So anyone doing well in life counts as a rationalist, in the sense we use the word on LW?

No, but that wasn't what we were talking about. They could have been lucky, but then their success didn't come about because rationality helped them win at life. You're saying a better rationalist will lose more often than a worse rationalist, which is wrong by definition since rationality is the art of accomplishing one's goals effectively.

Your claim appears to be "Rationality is winning." Therefore, if you're not winning you're not rational. Are you winning? [*] If not, are you not a rationalist?

The logic seems circular. You can say "I aspire to win and rationality is whatever gets me there", but that doesn't seem to define the pursuit in question at all.

[*] (This question is not "Can you quickly retcon your current state as 'winning'?")

You could be a really bad rationalist, or you could be not winning for other reasons.

Could one of the people who voted me down please explain why? I don't understand why this is even contentious. Have you read the Wiki? http://wiki.lesswrong.com/wiki/Rationality_is_systematized_winning

The comment I replied to reads like, "Learning to paint might make you make better paintings, but it won't make you a better painter." I replied, "Making better paintings is the definition of a better painter, so you're contradicting yourself." Then I got downvoted. What am I missing?