Another way our brains betray us

by polymathwannabe · 1 min read · 17th Sep 2013 · 11 comments


Personal Blog

This appeared in the news yesterday.

It turns out that in the public realm, a lack of information isn’t the real problem. The hurdle is how our minds work, no matter how smart we think we are. We want to believe we’re rational, but reason turns out to be the ex post facto way we rationalize what our emotions already want to believe.


The bleakest finding was that the more advanced that people’s math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem. [...] what these studies of how our minds work suggest is that the political judgments we’ve already made are impervious to facts that contradict us.


Denial is business-as-usual for our brains. More and better facts don’t turn low-information voters into well-equipped citizens. They just make them more committed to their misperceptions.


When there’s a conflict between partisan beliefs and plain evidence, it’s the beliefs that win. The power of emotion over reason isn’t a bug in our human operating systems, it’s a feature.

11 comments

Politics is the math-killer.

I noticed it for the first time when I was at university. There was a political discussion about impacts of privatizing, and some guy said: "The prices of electricity have increased by 10%, and the prices of heat have increased by 10%, and the prices of water have increased by 10%... which means that for people the total costs of living have increased by 30%..." and many students were nodding in agreement, and I was like: WTF is happening here? Here is the best mathematical university in our country, and people are seriously believing that if you split something into small parts and increase three of them by 10%, then the whole has increased by 30%? How is that even possible? I was already aware that politics can make people crazy, but I didn't imagine it could ruin their mathematical skills so completely. It took some time to explain, and even then instead of admitting an error some people offered some rationalization instead (such as: the 10% was the estimate for now and the 30% was the extrapolation for the future; or that they meant "more than 10%" and "almost 30%", which is not completely exclusive; etc.). At that time I already knew that most people hate to admit they made an error.
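The arithmetic error is easy to check: if three components of a budget each rise by 10%, the total rises by at most 10% (exactly 10% only if those are the *entire* budget), never 30%. A minimal sketch with made-up budget shares:

```python
# Hypothetical monthly budget shares (invented numbers, for illustration only).
budget = {"electricity": 50.0, "heat": 80.0, "water": 30.0, "other": 240.0}

old_total = sum(budget.values())  # 400.0

# Electricity, heat, and water each go up by 10%; everything else is unchanged.
new_budget = {k: (v * 1.10 if k in ("electricity", "heat", "water") else v)
              for k, v in budget.items()}
new_total = sum(new_budget.values())  # 416.0

increase = (new_total - old_total) / old_total
print(f"Total cost of living increase: {increase:.1%}")  # 4.0%, nowhere near 30%
```

Even in the extreme case where utilities are the whole budget, the increase is exactly 10%, because percentage increases on parts of a whole combine as a weighted average, not a sum.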

direct link to paper

Relevant quote from popular source:

In Kahan’s experiment, some people were asked to interpret a table of numbers about whether a skin cream reduced rashes, and some people were asked to interpret a different table – containing the same numbers – about whether a law banning private citizens from carrying concealed handguns reduced crime. Kahan found that when the numbers in the table conflicted with people’s positions on gun control, they couldn’t do the math right, though they could when the subject was skin cream. The bleakest finding was that the more advanced that people’s math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem.

The rash cream experiment is a randomized trial, whereas the concealed-weapon ban is not. That's not enough to explain away the effect, but it is consistent with the hypothesis that people put more weight on priors in confusing situations (and people who are better at math find the gun control scenario relatively more confusing).
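For concreteness, the covariance task asks subjects to compare improvement *rates* across the two groups, not raw counts. Using the figures commonly quoted from Kahan's skin-cream condition (secondhand numbers; the point here is the method, not the exact counts), the trap is that the intuitive comparison gives the wrong answer:

```python
# Counts as commonly quoted from the skin-cream condition of Kahan's study
# (treat the exact figures as illustrative).
used_cream = {"improved": 223, "worsened": 75}
no_cream   = {"improved": 107, "worsened": 21}

def improvement_rate(group):
    """Fraction of the group whose rash improved."""
    return group["improved"] / (group["improved"] + group["worsened"])

rate_cream = improvement_rate(used_cream)  # ~74.8%
rate_none  = improvement_rate(no_cream)    # ~83.6%

# The raw "improved" count is larger in the cream group (223 > 107), which is
# the intuitive-but-wrong comparison. The rates tell the opposite story:
print(f"cream: {rate_cream:.1%}, no cream: {rate_none:.1%}")
```

Since the no-cream group improved at a higher rate, the data as given suggest the cream did not help, which is exactly the conclusion subjects failed to compute when the same table was relabeled as gun-control data that conflicted with their politics.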

The bleakest finding was that the more advanced that people’s math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem.

So, my first thought was that this might be a trivial statement. If any person is 30% likely to give the wrong answer for partisan reasons, but the chance of getting it wrong depends on innumeracy, then partisanship will be the dominant reason for error among highly numerate people, because it's basically the only source of error, even though total error is lower.
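The "trivial statement" reading can be made concrete with a toy model: give every subject a fixed 30% chance of a partisan error and an independent, numeracy-dependent chance of a math slip (all probabilities invented for illustration). Among the highly numerate, almost all remaining errors are partisan, even though their total error rate is lower:

```python
# Toy model with invented probabilities, just to show the decomposition.
p_partisan = 0.30  # chance of answering wrong for partisan reasons

for label, p_math in [("low numeracy", 0.50), ("high numeracy", 0.05)]:
    # Wrong if either error occurs (assumed independent).
    p_wrong = 1 - (1 - p_partisan) * (1 - p_math)
    # Share of wrong answers in which a partisan error was involved.
    share_partisan = p_partisan / p_wrong
    print(f"{label}: total error {p_wrong:.1%}, "
          f"partisan share {share_partisan:.0%}")
```

Under these made-up numbers, the low-numeracy group gets 65% wrong with partisanship involved in about 46% of errors, while the high-numeracy group gets only 33.5% wrong but with partisanship involved in about 90% of errors. So "numerate people's errors are mostly partisan" would be true even if numeracy helped uniformly, which is why Figure 6's flat curves are the genuinely alarming part.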

But it turns out this is a rather meaningful statement (as shown by Figure 6 of the paper). Among liberals, numeracy has basically no impact on whether or not they correctly answered the political question when it disagreed with their priors (which is, frankly, horrifying). Conservatives who scored 8 or 9 (the max score) on numeracy were slightly better at giving the answer that went against their political position than less numerate conservatives, but only slightly. When the answer supported by the data matched the expected preconceived notion, both conservatives and liberals showed a positive relationship between numeracy and giving the correct answer. (Oddly, for conservatives the relationship looks linear, but for liberals it's flat-ish then sharply positive.)

That seems odd. What's the difference between being handed a list of numbers that supports gun control being useless and being handed a list of numbers that you misinterpret as supporting gun control being useless?

I would have expected numeracy to be completely independent of the answer....

My takeaway from this is that you need to "shut up and multiply" every single time. Looking at the math skills study, the thought was that people glance at the raw numbers (instead of looking at the ratios) and stop there if they fit their ideological beliefs. If it conflicts with your beliefs, though, you spend a little longer and figure out you need to look at the ratio. So if we train ourselves to always "shut up and multiply", hopefully some of this effect will go away. Maybe a follow-up study to see if people who actually do the math still get it wrong?

If preconceived notions make it impossible to do math, then how can we possibly get a result that contradicts our preconceived notions?

I don't think the article suggests it's literally impossible (some of the respondents passed the test), just terribly hard.

This is news? I'm pretty sure I've been hearing about this exact experiment regularly for years.

[anonymous]

Generic post title is unspecific.

[This comment is no longer endorsed by its author]

The question is what that emotional core is and where it comes from, because that's what your utility function is. Rationalists emotionally value the perception of being correct and like to use reason to attain this perception. In order to apply rationality equally to all your mappings of the territory, you have to have no other emotional impulses that conflict with decoding the raw truth instead of the truth you prefer to see. If you just really like guns and perceiving yourself as correct, and don't value anything else, you'll do everything you can to minimize people giving you trouble because you own firearms. If this means never shooting other agents, so be it.

I can guess this comment won't be popular the moment I formally assert that the brain is ultimately just a Bayesian calculating machine, with the utility functions (determined via deep emotional decision) being the only fundamental difference between two people. Of course, up at the level of actual human interaction, the conscious mind isn't the Bayesian agent. The real Bayesian agent is the emotional core that's influencing all the higher-order operations of your brain—consciousness and the reasoning it perceives included. This information can only really be processed by agents with a high ratio of utilitarianism in their utility function; those among you who actually care about what other people think. Might as well be speaking Greek to anyone else.