Typical Mind and Politics




Scott Alexander

Yesterday, in The Terrible, Horrible, No Good Truth About Morality, Roko mentioned some good evidence that we develop an opinion first based on intuitions, and only later look for rational justifications. For example, people would claim incest was wrong because of worries like genetic defects or later harm, but would continue to insist that incest was wrong even after all those worries had been taken away.

Roko's examples take advantage of universal human feelings like the incest taboo. But if people started out with opposite intuitions, then this same mechanism would produce opinions that people hold very strongly and are happy to support with as many reasons and facts as you please, but which are highly resistant to real debate or to contradicting evidence.

Sound familiar?

But to explain politics with this mechanism, we'd need an explanation for why people's intuitions differed to begin with. We've already discussed some such explanations - self-serving biases, influence from family and community, et cetera - but today I want to talk about another possibility.

A few weeks back, I was discussing harms with Bill Swift on Overcoming Bias. In particular, I was arguing that nuisance noise presented an open-and-shut case for government restriction of private activity on private property. I argued that if you were making noise on your property, and I could hear it on my property, then I was being harmed by your actions, and there was just as clear a case for government intervention here as if you were firing flaming arrows at me from your property. I fully expected Bill to agree that this was obviously true but to have some reason why he didn't think it applied to our particular disagreement.

Instead, to my absolute astonishment, Bill said that noise wasn't really a problem. He said he lived on a noisy property and had just stopped whining and gotten on with his life. I didn't really know how to react to this[1], and ended up assuming either that he'd never lived in a really noisy place like I have, or that he was such a blighted ideologue that he was willing to completely contradict common sense in order to preserve his silly argument.

In other words, I was assuming the person I was debating was either astonishingly stupid or willfully evil. And when my thoughts tend in that direction, it usually means I'm missing something.

Luckily, in this case I'd already written a long essay explaining my mistake in detail. In Generalizing From One Example, I warned people against assuming everyone's mind is built the same way their own mind is. One particular example I gave was:

I can't deal with noise. If someone's being loud, I can't sleep, I can't study, I can't concentrate, I can't do anything except bang my head against the wall and hope they stop. I once had a noisy housemate. Whenever I asked her to keep it down, she told me I was being oversensitive and should just mellow out.


So it seems possible to me that I have an oversensitivity to noise and Bill has an undersensitivity to it. When someone around me is being noisy, my intuitions tell me this is extremely bad and needs to be stopped by any means necessary. And maybe Bill's intuitions tell him that this is a minor non-problem. I won't say that this is actually behind our disagreement on the issue - my guess is that Bill and I would disagree about government regulation of pollution from a factory as well - but I think it contributes and it makes our debate much less productive than it would have been otherwise.

Let me give an example of one place I think a mind difference *is* behind a political opinion. In Money, The Unit of Caring, Eliezer complained that people were too willing to donate time to charity, and too unwilling to donate money to charity. He gave the example of his own experience, where he felt terrible every time he gave away money, but didn't mind a time commitment nearly as much. I fired back a response that this was completely foreign to me, because I am happy to give money to charity and often do it before I've even fully thought about what I'm doing, but will groan and make excuses whenever I'm asked to give away time. I also mentioned that this was a general tendency of mine: I have minimal aversion to monetary loss[2], but wasting time makes me angry.

A few months ago, Barack Obama proposed a plan (which he later decided against) to make every high school and college student volunteer a certain amount of time to charity. Although I usually like Obama, I wrote an absolutely scathing essay about how unbearably bad a policy this was. It was a good essay, it convinced a number of people, and I still agree with most of the points in it. But...

...it was completely out of character for me. I'm the sort of person who heckles libertarians with "Stop whining and just pay your damn taxes!" Although I acknowledge that many government policies are inefficient, I tend to just note "Hmmm, that government policy is suboptimal, it would be an interesting mental puzzle to figure out how to fix it" rather than actually getting angry about it. This Obama proposal was kind of unique in the amount of antipathy it got from me.

So here's my theory. My brain is organized in such a way that I get minimal negative feelings at the idea of money being taken away from me. We can even localize this anatomically - studies show that the insula is the part responsible for sending a pain signal whenever the loss of money is considered. So let's say I have a less-than-normally-active insula in this case. And I get a stronger than normal pain signal from wasted time. This explains why I prefer to donate money than time to my favorite charity.

And it could also explain why I'm not a libertarian. One consequence of libertarianism is that you have every right to feel angry when you're taxed. But I don't feel angry, so the part of my brain that comes up with rational justifications for my feelings doesn't need to come up with a rational justification for why taxation is wrong. I do feel angry about being made to do extra work, so my brain adopted libertarian-type arguments in response to the community service proposal. I predict that if I lived in one of those feudal countries with a work levy rather than a tax, I'd be a libertarian, at least until the local knight heard my opinions and cut off my head.

And I don't mean to pick on libertarians. I know different people have completely different emotional responses to the idea of other people suffering. For example, I can't watch documentaries on (say) the awful lives of mine workers, because they make me too upset. Other people watch them, think they're great documentaries, and then spend the next hour talking about how upset they made them. And other people watch them and then ask what's for dinner. You think that affects people's opinions on socialism much?

Imagine a proposal to institute a tax that would raise money for some effort to help mine workers. Upon hearing of it, different people would feel an emotional burst of pain of a certain size at the thought of the tax, and an emotional burst of pain of a different size at the thought of the mine workers. Neither of these bursts of pain would be proportional to the actual size of the problem as measured in some sort of ideal utilon currency (note especially scope insensitivity). But the brain very often makes decisions by comparing those two bursts of pain (see How We Decide, or just the insula article above) and only then comes up with reasons for the decision. So all the important issues, like economic freedom and labor policy and maximizing utility, get subordinated to whether you're secreting more neurotransmitters in response to money loss or to images of sad coal miners.

If this theory were true, we would expect to find neurological differences between people of different political opinions. Ta da! A long list of neurological findings that differ between liberals and conservatives. Linking the startle reflex and the disgust reaction to the policies favored by these groups is left as a (very easy) exercise for the reader[3].

This may require some moderation of our political opinions on issues where we think we're far from the neurological norm. For example, I am no longer so confident that noise is such a big problem for everyone that we would all be better off with strict regulations on it. But I hope Bill will consider that some people may be so sensitive to noise that not everyone can just shrug it off, and so there may be a case for at least some regulation of it. Likewise, even though I don't mind taxes too much, if my goal is a society where most people are happy, I need to consider that a higher tax rate will decrease other people's happiness much more quickly than it decreases mine.

Other than that, it's just a general message of pessimism. If people's political opinions come partly from unchangeable anatomy, it makes the program of overcoming bias in politics a lot harder, and the possibility of coming up with arguments good enough to change someone else's opinion even more remote.

Footnotes

[1] I am suitably ashamed of my appeal to pathos; my only defense is that it is entirely true, that I have only just finished moving, and that this post is hopefully a more appropriate response.

[2] Actually, it's more complicated than this, because I agonize over spending money when shopping. I seem to use different thought processes for normal budgeting, and I expect there are many processes going on more complex than just high versus low aversion to money loss.

[3] Possibly too easy. It's easy to go from that data to an explanation of why conservatives worry more about terrorism, but then why don't they also worry more about global warming?