Some Remarks on the Nature of Political Conflict


[Part of the debate around: Conflict Vs. Mistake]

[Criticizing articles like: In defence of conflict theory, Conservatives As Moral Mutants (part of me feels like the link is self-trigger-warning, but I guess I will just warn you that this is not a clever attention-grabbing title, the link means exactly what it says and argues it at some length)]

[Related to: Knowing About Biases Can Hurt People, Would Your Real Preferences Please Stand Up?, The Cowpox of Doubt, Guided By The Beauty Of Our Weapons]

[Epistemic effort: I thought of this argument and was so pleased by my own cleverness that I decided to post it.]

[Note: I have a nagging feeling I’ve spent a thousand words spelling out something completely obvious. Still, I hope there’s value in actually spelling it out.]

For the last five months, there has been a flurry of discussion around the nature of political conflict in the rationality movement, sparked by a post on Scott Alexander's blog Slate Star Codex drawing a dichotomy between mistake theorists, who think their political opponents are mistaken on factual policy questions, and conflict theorists, who think their political opponents are innately evil. There have been a lot of good articles on the subject on every side, at both the object level and the meta level (well, at both the meta level and the meta-meta level), but also many bad ones resting on mistakes (I know, I am showing my side here).

One class of pro-conflict-theory arguments that bother me a lot goes like this:

Mistake theory can't be the correct worldview because, for example, it's historically documented that tobacco companies hired scientists to spread misinformation about whether smoking causes cancer instead of thinking about it in a rational way.

Other historical case studies used include the rise of liberal democracy, the abolition of slavery, giving women the right to vote, the end of segregation, etc.

A scientific theory that is often used in this kind of argument is Jonathan Haidt's work on political psychology. Jonathan Haidt, with his co-conspirator Jesse Graham, created moral foundations theory, according to which morality is divided into five foundations:

  • Care: cherishing and protecting others; opposite of harm
  • Fairness or proportionality: rendering justice according to shared rules; opposite of cheating
  • Loyalty or ingroup: standing with your group, family, nation; opposite of betrayal
  • Authority or respect: submitting to tradition and legitimate authority; opposite of subversion
  • Sanctity or purity: abhorrence for disgusting things, foods, actions; opposite of degradation

A shocking and unexpected discovery of moral foundations theory is that conservatives value Loyalty, Authority, and Sanctity more than liberals do. (Liberals also value Care and Fairness more than conservatives do, but this effect is orders of magnitude smaller than the other one.) Some conflict theorists, both liberal and conservative, have seized on this to claim that conflict theory is correct and that those darned Blues are moral mutants who can't listen.

This is the popular understanding of moral foundations theory, anyway. In reality, this is only pluralism, the fourth claim of moral foundations theory. The theory's four claims are¹:

  1. Nativism: There is a “first draft” of the moral mind
  2. Cultural learning: The first draft gets edited during development within a particular culture
  3. Intuitionism: Intuitions come first, strategic reasoning second
  4. Pluralism: There were many recurrent social challenges in our evolutionary history, so there are many moral foundations

The third claim is intuitionism. Social intuitionism, as a psychological theory, is older than the moral pluralism that is often equated with moral foundations theory in pop science. Jonathan Haidt wrote about it in 2001, years before he wrote about moral pluralism. Social intuitionism is a model that proposes that moral positions and judgments are²:

  1. primarily intuitive ("intuitions come first")
  2. rationalized, justified, or otherwise explained after the fact
  3. taken mainly to influence other people, and are
  4. often influenced and sometimes changed by discussing such positions with others

If you look at what you think is moral foundations theory (but is actually only moral pluralism, without the background of social intuitionism that is necessary to fully understand it), you might get the impression that people with different moral intuitions than you hold them consciously. The reality is much, much worse than that. Let's say Pro-Skub people value Skub and Anti-Skub people don't. Pro-Skub people don't know that their moral positions and judgments are primarily intuitive. They don't know that intuitions come first. They rationalize valuing Skub, justify it, and otherwise explain it after the fact. Similarly, Anti-Skub people rationalize not valuing Skub, justify it, and otherwise explain it after the fact.

This is very different from what the popular misunderstanding suggests! The popular misunderstanding suggests that you can trust your brain to be correct about the value of Skub, given that the only reason your opponents do or don't value Skub is that they have different terminal values than you. In reality, social intuitionism says that your brain is broken, that it is rationalizing its reasons to value or not value Skub, and that your opponents' brains are broken in exactly the same way. Social intuitionism says that you can't trust your broken brain.

Rationalization is, of course, not limited to moral positions and judgments. It and its buddies, confirmation bias and motivated cognition, wander everywhere. It's not a coincidence that Motivated Stopping and Motivated Continuation specifically uses the example of tobacco science. But you - yes, you - aren't immune to rationalization, confirmation bias, or motivated cognition. You can't trust your brain not to do it. You can't trust your brain not to be the next conflict theorist case study.

Luckily, the fourth tenet of social intuitionism is that moral positions and judgments are often influenced and sometimes changed by discussing such positions with others. Your best way to not let your brain be the next conflict theorist case study is to deliberately exploit this as best you can. To not let your brain be the next conflict theorist case study, debate is essential. We all bring different forms of expertise to the table, and once we all understand the whole situation, we can use the wisdom of crowds to converge on the correct answer. Who wins on any particular issue is less important than creating an environment where your brain won't be the next conflict theorist case study.

What's the worst thing you could do in your quest to not let your brain be the next conflict theorist case study? Probably treating everything as war and viewing debate as having a minor clarifying role at best. That's the best way for rationalization, confirmation bias, motivated cognition, and self-serving bias to creep in. This is how most of the conflict theorist case studies thought.

Mistake theory is the correct worldview precisely because tobacco companies hired scientists to spread misinformation about whether smoking causes cancer instead of thinking about it in a rational way.

¹: Graham, Jesse and Haidt, Jonathan and Koleva, Sena and Motyl, Matt and Iyer, Ravi and Wojcik, Sean P. and Ditto, Peter H., Moral Foundations Theory: The Pragmatic Validity of Moral Pluralism (November 28, 2012). Advances in Experimental Social Psychology, Forthcoming. Available at SSRN: https://ssrn.com/abstract=2184440

²: Haidt, Jonathan (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion. Pantheon. p. 913, Kindle ed. ISBN 978-0307377906.