Belief as Attire

I have so far distinguished between belief as anticipation-controller, belief in belief, professing and cheering.  Of these, we might call anticipation-controlling beliefs "proper beliefs" and the other forms "improper belief".  A proper belief can be wrong or irrational, e.g., someone who genuinely anticipates that prayer will cure her sick baby, but the other forms are arguably "not belief at all".

Yet another form of improper belief is belief as group-identification—as a way of belonging.  Robin Hanson uses the excellent metaphor of wearing unusual clothing, a group uniform like a priest's vestments or a Jewish skullcap, and so I will call this "belief as attire".

In terms of humanly realistic psychology, the Muslims who flew planes into the World Trade Center undoubtedly saw themselves as heroes defending truth, justice, and the Islamic Way from hideous alien monsters a la the movie Independence Day.  Only a very inexperienced nerd, the sort of nerd who has no idea how non-nerds see the world, would say this out loud in an Alabama bar.  It is not an American thing to say.  The American thing to say is that the terrorists "hate our freedom" and that flying a plane into a building is a "cowardly act".  You cannot say the phrases "heroic self-sacrifice" and "suicide bomber" in the same sentence, even for the sake of accurately describing how the Enemy sees the world.   The very concept of the courage and altruism of a suicide bomber is Enemy attire—you can tell, because the Enemy talks about it.  The cowardice and sociopathy of a suicide bomber is American attire.  There are no quote marks you can use to talk about how the Enemy sees the world; it would be like dressing up as a Nazi for Halloween.

Belief-as-attire may help explain how people can be passionate about improper beliefs.  Mere belief in belief, or religious professing, would have some trouble creating genuine, deep, powerful emotional effects.  Or so I suspect; I confess I'm not an expert here.  But my impression is this:  People who've stopped anticipating-as-if their religion is true, will go to great lengths to convince themselves they are passionate, and this desperation can be mistaken for passion.  But it's not the same fire they had as a child.

On the other hand, it is very easy for a human being to genuinely, passionately, gut-level belong to a group, to cheer for their favorite sports team.  (This is the foundation on which rests the swindle of "Republicans vs. Democrats" and analogous false dilemmas in other countries, but that's a topic for another post.)  Identifying with a tribe is a very strong emotional force.  People will die for it.  And once you get people to identify with a tribe, the beliefs which are attire of that tribe will be spoken with the full passion of belonging to that tribe.


98 comments

Paul, I looked up a list of the most religious states in the US. But if you actually go into an Alabama bar and say it, I'll change the post (not recommended).

I'm not about to put my money where my mouth is on that one.

Or maybe it's a matter of existential risk? If there's a 1/10 chance of him being horribly wrong, then I don't particularly blame him for not testing it. I might believe quite thoroughly, but not want to test it when the explosive is directly in front of me.

I'd happily test it from behind a blast wall, though.

This might be an especially easy category of bias to identify. Just ask yourself if you feel proud that this belief associates you with some group with which you want to be associated. If so, weaken your confidence in this belief.

Up to and including my belief that the scientific method is the best approach to understanding the world?

Absolutely, yes. No question that part of the reason we believe that is in order to identify with our tribe. I personally am prepared to reduce it by three orders of magnitude, from 1-epsilon to 1-1000*epsilon.

Eliezer, I have thought of another sort of belief that is not an anticipation-controller. Sometimes, I hear quite smart young people (who don't just wear beliefs as attire) profess to a belief in physicalism about qualia, or in libertarianism, or in the virtues of the scientific method, or in anti-pseudoscience (a la Martin Gardner), or in global-warming skepticism (a la Bjorn Lomborg), or in consequentialist egoism, or some similar broad philosophical or political doctrine. When I talk to these people, I find out that they can give a number of good arguments for why someone should follow their position, but that they have little to say in response to arguments for why people should follow alternative positions. For example, they might be able to clearly state various arguments for libertarianism, and to respond well to counters to those arguments. Yet when I tell them various arguments in favor of alternative positions (e.g. democratic socialism), their attempted rebuttals are much weaker in quality than their positive arguments for the position they claim to hold.

This usually occurs because these people have read good books or articles advocating physicalism, libertarianism, and egoism, and have learned (and been convinced by) the arguments contained therein. After reading these books, these people want to talk to others about what they've read and show that they can understand and reconstruct difficult arguments. They could just say to their acquaintances something like "I've read this book and I'd like to discuss some of the arguments in it with you". But, for various reasons, they often instigate such conversations by saying: "I don't think there should be any income tax at all, nor any other taxes. I'm a libertarian." From here, a heated discussion may ensue about the merits of libertarianism, in which the neophyte can relate all his carefully reconstructed arguments to his audience. This allows the new libertarian to look clever (since he can relate good arguments) and well-read (since he can quote Nozick's views on politics). It also provides the libertarian with practice in thinking and arguing on the spot, and in articulating difficult ideas.

I don't count this as belief as anticipation-controller. [I'll leave aside questions about whether beliefs in political or ethical doctrines can have empirically testable consequences. The point I'm making works just as well with global-warming skepticism as it does with libertarianism or dualism.] The person who calls himself a libertarian or a global warming skeptic after reading a couple of books and a few articles arguing for libertarianism or global-warming skepticism will often acknowledge (if honest) that if he'd started by reading books advocating alternative views, then he would not have come to be a libertarian or global-warming skeptic. He knows that he hasn't made an attempt to hear views from both sides of contentious issues, despite there being very smart and thoughtful people advocating opposing positions.

Yet this lack of balance in his reading is not a problem for him. He is not actually going to act on his belief in any serious way. He's not about to give his money and time to support global-warming skepticism or libertarianism. He would probably not bet money on these doctrines being true, unless the bet was a small enough fraction of his wealth that it would be worthwhile for the attention it garnered. When he says "I'm a libertarian", what he means is "I can articulate a number of good arguments for libertarianism, along with replies to common objections to these arguments".

I think that some professional philosophers hold beliefs in a similar way. The philosopher might come up with some clever arguments for position X. Instead of writing up a paper or blog post that simply relates these clever arguments, he will probably write an article or book that gives lots of arguments for position X, including his new clever arguments (e.g. on anti-dualism). In spending lots of time studying all the good arguments for X (and in refining his own clever arguments), he will end up with an impressive ability to make arguments in support of X. Yet he probably won't have given the same open-minded and lengthy study to arguments in support of alternatives to X (i.e. not-X). Hence, when he says to people that he is an Xist or that he believes in X, what he means is "I can give lots of really sophisticated arguments for X".

(This is not true of all philosophers, as some seem to strive to criticize their own positions. Also, it is not just philosophers that are guilty of this. Some scientists will spend their lives doing experiments that provide evidence for some view X, and they won't have invested as much time in learning about the experiments and arguments of people who have been trying to show that not-X.)

Good point, Tarleton - although I'm still hard-pressed to think of a better example that isn't directly a religious belief. If you only use the obvious religious examples, people will fall into the standard trap of thinking they've achieved perfection as rationalists because they're not religious - I wanted to use something that would actually strike a sympathetic chord and let people see how the belief-as-attire effect extends beyond religion. Got a better suggestion?

Vassar, also a good point, although I'm skeptical that I would have difficulty empathizing - these are humans we're talking about, not aliens, and the WTC hijackers were mostly educated Saudi Arabians, not Yanomamö. They saw themselves as heroes in the support of causes, such as sexual decency (i.e., women's de-emancipation), which are not American causes; they believed and maybe even anticipated 72 virgins; they fought in guardianship of ancient perfection; they carried out the will of God revealed in perfect scripture. None of this strikes me as a significant barrier to understanding. Can you say specifically what you think presents the barrier to empathy?

You want some belief-as-attire that LW people wear? How about some of the things people say "as Singularitarians", i.e. not because they really have thought the matter through themselves, but because it is the standard position of Singularitarians.

(It's not always easy to distinguish, granted. You could believe in cryonics because you really have evidence to suggest that this is the best use of resources... or you could believe in cryonics because that's "what Singularitarians believe".)

Good posts. This series is the first thing in a while to make me really glad to participate here.

I think that the stereotype of Alabama bars is pretty reliable. OTOH, the stereotype of suicide bombers is much much less so. If you read the rhetoric of radical Islam, or for that matter if you read ancient mythology such as Homer or the Egyptian Book of the Dead, you will see people who are occupying a VERY VERY different moral universe from us Platonized Christianized (that includes the secular children of "modern orthodox" Jews) post-Enlightenment Westerners.

In terms of realistic psychology fitting neither the SSSM nor the Evolutionary Psychology brand (which you really should spend more time reading non-leftist criticisms of), the Muslims who flew planes into the World Trade Center undoubtedly saw themselves as heroes, but in some sense that we would have a VERY hard time empathizing with or relating to. They are NOT a mirror reflection of ourselves, but genuinely something that has to be understood with empiricism, not empathy and wishful thinking.

I like your information, but I disagree with your conclusion. I don't think it is beyond the reach of empathy to understand them as thinking of themselves as heroes. Steven_Bukal and TuviaDulin make very persuasive arguments, above. Years later, I admit, but I think I remember detecting some empathy for the bombers at the time. Because I was looking for it.

Empathy is hard. Cultures differ. We Americans (especially secular Americans?) really don't have a clue what it feels like to (for instance) feel an obligation to kill our daughters or our sisters in order to preserve our family honor. Some actions in the name of causes may be psychologically modular, but some really aren't. What's the parallel for honor killing? Pressuring one's schizophrenic philosophy post-doc son to go to law school where he thinks he'll be miserable for the bragging rights? Sending your kids to Hebrew School or Day Camps they hate because your parents made you do it? It just doesn't work. Even within a culture, I have no idea what it's like to identify with a sports team and very few people can relate to the horror that I feel at some psychological data or philosophical ideas. You once pointed out that most of us can no longer even understand why the Psycho shower scene was once considered terrifying. I would recommend Sylvia Plath's diary for what are to me stranger attitudes than those.

You once pointed out that most of us can no longer even understand why the Psycho shower scene was once considered terrifying.

Why was that? I can't find it with a search.

Are you breaking your advice to not use contemporary politics in examples?

I doubt it's going to be very controversial that the 9/11 attacks were morally bad. (Though it might be interesting if someone is bold enough and contrarian enough to argue that they were justified?)

Sounds like someone needs to examine his bias re Alabama bar patrons.

Eliezer, it seems to me that you can and should shift your probability estimate because it makes you feel proud. Of course you might do even better than that by recalculating your reasons, but that approach will often not be cheap or reliable.

What would falsify that model of belief as attire?

Pseudonymous, I confess that it is only a guess, just a more plausible guess than the American one. And Jack the Ripper might well have been a monster - there are such things as sociopaths.

Robin, I'd say "recalculate your reasons" not "weaken your confidence". You can't literally shift a probability estimate because it makes you feel proud.

You can't literally shift a probability estimate because it makes you feel proud.

Why ever not? It feels as if a belief-forming robot probably shouldn't. But what if pride were its label for 'probable flaw in calculation detected'?

I thought this site would be the last place I'd see criticism of the "suicide bomber as cowardly" notion. Under some definitions, sure, doing something you expect to result in your death, in pursuit of a higher goal, necessarily counts as courage. However, it would be justifiable to say they are intellectually cowardly. That is, rather than advance their ideas through persuasion, and suffer the risk that they may be proven wrong and have to update their worldview; rather than face a world where their worldview is losing, they "abandoned" the world and killed a lot of their intellectual adversaries.

It is an escape. There is, after all, no "refutation" for "I'm right because I'm blowing up myself and you".

It's for the same reason one might apply the "coward" label to a divorced, jealous husband, who tries to "get back" at his ex-wife by killing her or their child. He, too, exposes himself to immense risk (incarceration, or harm if they defend themselves). He too, is pursuing a broader goal. Yet in that case, my calling him a coward is not an artifact of my disagreement with his claim that he has legitimate grievances -- in fact, I might very well be on his side (i.e., that the courts did not properly adjudicate his claim).

So yes, it might be the "American" thing to say terrorists are cowardly -- but that doesn't make the claimant biased or wrong.

Is that what extremist Americans mean when they say cowardly?

No, that's probably just belief as attire. My point is just that reasonable interpretations of "Suicide bombers are cowardly" allow the statement to be true, even if people don't mean the true version, or if they came to that conclusion for the wrong reason.

Welcome to Less Wrong! Feel free to introduce yourself on that thread. Don't hesitate to browse the recommendations from the About page or start in on the Sequences. Kaj_Sotala also posted a first and a second list of favorite posts, which are also quite good.

Your point is a good one - I don't know if you read The Bottom Line (or Rationalization, the followup), but they make a similar point in a well-phrased way you might enjoy.

Except that based on videos and letters left behind, the hijackers considered Americans to be not just intellectual adversaries, but wartime ones. I believe the majority of the hijackers cited American military presence in the Middle East and military and economic support of Israel to that effect.

So what were the specific arguments they used when persuading acolytes of the great satan that their position has more merit? Or was it confined to "BOOM!"?

My point is that using violence to silence intellectual adversaries is very different from using violence against a perceived wartime enemy.

Their ideology might be intellectually cowardly. But sacrificing your own life in battle against a perceived enemy is not a cowardly act. When people call the attacks cowardly, they're talking about the attacks themselves, not the worldview of the attackers.

I think most non-LWers who refer to the attacks as cowardly mean that they were conducted against unresisting, nonmilitary targets. The people killed couldn't fight back (or at least weren't expected to fight back), and attacking someone who isn't expected to fight back is widely seen as cowardly.

In this case, of course, other aspects of the operation were hazardous to the terrorists even if they didn't expect anyone to fight back, but I believe most people who consider the attack as cowardly are treating these aspects separately.

I think actually you're a bit confused about the difference between instrumental virtues, like courage, and inherent virtues, like benevolence. (Which list "rationality" goes on is actually a tricky one for me. In a certain sense, Stalin seems terrifyingly rational.)

I guess we could talk about "intellectual courage" versus "physical courage" or something like that, and your argument is that these men were not intellectually courageous. But usually when people say "courage" simpliciter, they mean a willingness to act in spite of a high risk of pain and death. And this the hijackers definitely had!

Indeed, there's something truly terrifying about the Al Qaeda hijackers: They were mostly right about their moral values. They were altruistic, courageous, devoted to duty. It's only this very small deviation---"maximize deference to Islam" instead of "maximize human happiness"---that made them do such terrible things.

This also meshes with what we know about the Milgram and Zimbardo experiments; quite ordinary people, if convinced that they are acting toward a higher moral purpose, will often do horrific things. The average Nazi was not a psychopath, not a madman; he believed that what he was doing was right. And this should be the most chilling fact of all.

I suspect that the Muslim hijackers, in a strange way, thought they were maximizing human happiness by removing Americans from the world.

I think it more likely they thought they were doing the will of Allah. Happiness? Happiness is for pigs.

I think it more likely they thought they were doing the will of Allah. Happiness? Happiness is for pigs.

Well that explains the no bacon and pork rule.

I used to assume (possibly through overapplied principle of charity) that the accusations of cowardice had to do with their "escaping" the consequences of their actions by dying, especially if they anticipated heaven.

Specifically, I wonder how comparatively scared they'd have been at the prospect of:

  • Surviving through being captured and extrajudicially detained
  • Surviving through being captured and subjected to a nationally televised trial
  • Destroying the towers, but somehow surviving long enough to be trapped in the wreckage with a dying Muslim girl who has no idea what's happening
  • Being given a teleporter they could use to escape just before the impact, knowing that each of their compatriots had refused the same offer
  • Being ordered to destroy the towers by firing a super rocket launcher in broad daylight in full view of bystanders
  • Being ordered to destroy the towers with remote explosives, then return to their normal lives with only themselves to know they'd helped kill thousands of people.

Bob: Great post.

Eliezer: I was not saying anything cannot be understood, but rather that using our specialized "empathic" capabilities for understanding human behavior in terms of our own hypothetical behavior is counterproductive to understanding many instances of human behavior when the humans in question are from different cultures or otherwise very different from one's self. It's easy to model it, possibly even to model it well (Chronicle of a Death Foretold by Gabriel Garcia Marquez tries to), but next to useless to model it by reference to your own feelings. If we couldn't model it at least somewhat we couldn't even form the concept and talk about it, but we don't have the special advantages here that we have when modeling a hungry person eating or the like.

For a less politically charged example, fairly young children (7 or 8, maybe 9?) could learn to model sexual desire, but they can't empathize. Adults cannot empathize with a child's enjoyment of TV shows targeting young children even though they have been children and may remember watching the same shows and enjoying them. Actually, since enjoyment is virtuous, that example answers your question, though across ages rather than across cultures. For 'virtuous', cross-cultural, and impossible to empathically model (ignoring that to some degree there is a contradiction here, as without the ability to model the state it is hard to be confident of its intrinsic virtuousness, only of the virtuousness of the actions it brings about, which could also have been pursued for utilitarian or other deeply generally human reasons such as caring. The symmetry with abhorrent actions is broken in this respect) there are surely many different types of meditation or other altered mental states, enjoyment of a vast number of foods (I can understand liking kumiss via a "comfort food" schema, but not kumiss *as* kumiss), entertainments, and art forms, and probably more subtle and general feelings having to do with attachment to the land, etc., though I can model these empathically to some degree.

Most generally of all, I already had given examples, in citing Haidt's 5 moral domains.

Hmm. Criticism of Haidt's theory. Haidt, and most other people, probably see conservatives and liberals as having equal lack of understanding of one another. (The point of his theory is that he lacks such empathic understanding, hence the need of an empirically derived theory.) However, his theory suggests that conservatives should easily understand liberals. The magnitude of one's disagreement shouldn't be the cause of empathy failure. Rather, empathy failure should follow from the apparent pointlessness of the action being criticized. For instance, it's probably easy for us to empathize with Joseph Mengele's actions while still strongly disapproving, as scientific curiosity is a shared motivation and the difference between his actions and actions we would approve of is because of his not applying enough weight to caring/sympathy considerations that we consider important. The opposite is true of one's experience reading about Isaac Bashevis Singer's father in "My Father's Court" or other good works of anthropology dealing with rich cultures. We are bemused by the apparent pointlessness of all of the ritual details that this silly man takes so very seriously, but he is harmless and we are not indignant at all.

In terms of humanly realistic psychology, the Muslims who flew planes into the World Trade Center undoubtedly saw themselves as heroes defending truth, justice, and the Islamic Way from hideous alien monsters a la the movie Independence Day. Only a very inexperienced nerd, the sort of nerd who has no idea how non-nerds see the world, would say this out loud in an Alabama bar.

I read this three times. First pass: What? Why? Maybe I missed something. *rereads* Second pass: Oh, would they not get the reference? But why would that be so bad? *rereads* Third pass: It's certainly plausible that it's severely overstating it to say they think of us as hideous alien monsters; I can think of other religious feelings that could lead to that level of bravery and dedication, so such a statement might make me seem insensitive and horribly ignorant. But I'm pretty sure random people in an Alabama bar wouldn't recognize that, so I'm still confused. *reads on*

Well. That's not good news for me is it?

Eliezer's probably saying that the patrons of said Alabama bar would be, shall we say, highly unlikely to appreciate the neutral point of view, probably due to ingroup biases. It's the arguments-as-soldiers thing again, and you're implicitly putting yourself on the wrong side.

I've never been to Alabama myself, so I don't know whether this is actually true or not. I suspect it wouldn't be as bad as he's implying (it might start an argument, but I wouldn't expect a fight), but that might be my optimism acting up.

Yes, I understood as soon as I read the next sentence. I just felt silly that I couldn't figure it out myself.

I've never been to Alabama myself, so I don't know whether this is actually true or not. I suspect it wouldn't be as bad as he's implying (it might start an argument, but I wouldn't expect a fight), but that might be my optimism acting up.

Maybe it's just because I'm a New Yorker, but trust me that you don't have to cross the Mason-Dixon line for people to be willing to sock someone who said something even remotely positive about the 9/11 hijackers. Things have cooled down a bit in the last twelve years, but there are still some things you just don't say. Or imply, in this case.

I know that I would personally have trouble restraining myself if someone expressed actual support for, or tried to equivocate-away, the crimes of terrorists in my presence. It's absolutely an issue of tribal loyalty, and not even entirely irrational; expressing empathy for an enemy weakens your resolve against them, which is not a particularly wise choice when the only way our tribe can lose is by giving up.

It seems pretty obvious to me that your tribe can also lose by directing its energy in the wrong direction, resulting in harms to yourselves. As, for example, has already happened with TSA, so I hear. (This doesn't mean "the terrorists have won" but it does mean you have lost.)

We're discussing people's emotional reactions to these types of statements and why they feel those emotions.

I pointed out that those reactions are typically strong and negative (and not just in Alabama), and that holding them is instrumentally rational.

Since this isn't preventing me from updating on any evidence presented (I absorbed the "everyone is the hero of their life story" moral years and years ago), I don't see that I'm particularly mind-dead in this scenario.

I saw mind killing in the particular phrase:

the only way our tribe can lose is by giving up

I also have doubts about that instrumental rationality.

My reasoning is... well it's hard to explain without going 100% RL politics, which is as rude as it is counterproductive. Basically there's different schools of thought on the strategy involved in asymmetrical warfare and I tend to come down on a particularly unpopular and effective side of the debate. That's all I'm willing to say in public.

In terms of instrumental rationality, it's pretty simple; being part of the winning team is generally useful, cheering and wearing the colors shows people you're on the team, and you cheer a lot more enthusiastically when you actually believe it. Cognitive dissonance gets a bad rap, but it really is a lot easier to compartmentalize than to maintain a lie long-term.

being part of the winning team is generally useful

True. However cheering for your team while dehumanizing your opponents is often a poor way to make your team stronger in the long run. Labeling someone a terrorist diminishes your desire to understand their motivations and eventually mitigate further terrorism. Instead one ends up supporting Iraq war-style mission creep resulting in the needless deaths of those on your team.

In terms of instrumental rationality, it's pretty simple; being part of the winning team is generally useful, cheering and wearing the colors shows people you're on the team, and you cheer a lot more enthusiastically when you actually believe it.

One thing is for certain: there is no stopping them; the ants will soon be here. And I for one welcome our new insect overlords. I’d like to remind them that as a trusted TV personality, I can be helpful in rounding up others to toil in their underground sugar caves.

I've never been to Alabama, but as I understand it the cultural climate in Alabama shares certain key characteristics with that in rural Massachusetts.

Were I, in a rural Massachusetts bar, to make any public statement to the effect that the individuals who flew planes into the WTC could plausibly be seen as heroes, or that they were comparable in any way to American soldiers fighting and dying for American interests (1), I would expect the locals to view this as a challenge to sacred virtues and to react accordingly.

I would not expect this to necessarily cause a fight (though it depends on how I went about it, and whether and how I backed down when those virtues were upheld by those around me); it wouldn't even necessarily get me asked to leave (though that's more likely, especially if I continued to defend that position).

(1) Edit: on further thought, I suspect that just talking about U.S. soldiers fighting for "American interests" (as opposed to "American values" or "America" or some such thing) would raise a suspicious eyebrow or two, as it superficially pattern-matches to a particular mid-1900s stereotypical formulation of Communist propaganda.

All curiosity exists to destroy itself; there is no curiosity that does not want an answer.

Vassar, it seems important to you that you not be able to understand certain acts - a badge of pride. I don't think I'm having trouble understanding an honor-killing. Someone else rapes your sister, it stains the family honor, she has to die, QED. It's not the way I think, but that doesn't stop me from modeling it.

In proof of this, I ask you, what virtuous mode of thought, or even mode of thought that you are not particularly indignant at, do you think yourself unable to understand across cultures?

"Eliezer's characterization describes a large minority of Americans very well."

All I see there are familiar platitudes, not a description of anybody who thinks about things. All I see, in fact, are familiar formulas employed by politicians. Nor are the formulas necessarily wrong. It should not be hard to see what is cowardly about most terrorist attacks.

American Heritage has a fairly good definition of cowardice: "Ignoble fear in the face of danger or pain."

The ignobility is an important factor which other dictionaries tend to miss. But American Heritage misses something that Cambridge has: "a person who is too eager to avoid danger, difficulty or pain"

It does not have to be danger and pain; it can be difficulty. So in a nutshell, a coward is someone who commits a discreditable act in order to avoid a difficulty (which might be pain or danger but might be something else). In the case of terrorists, the discreditable act is an attack on civilians, and the difficulty thereby avoided is the difficulty of engaging the enemy's armed forces. Similarly, breaking certain Geneva conventions is cowardly: disguising yourself as, and mixing with, civilians in order to shield yourself from the enemy is committing a discreditable act (using civilians as shields) in order to avoid a difficulty (greater exposure to enemy fire).

I will quote an old essay on the topic and answer some key points.

"Perhaps the idea is that it is cowardly to make a sneak attack, especially on a defenseless civilian target, rather than confront an armed enemy face to face. But no one seriously expects Osama Bin Laden to invite the 101st Airborne to fight his terrorist organization on equal terms."

The first sentence is a fair summary of the point I just made, but the second sentence is no answer. Compare the above with the following:

"Perhaps the idea is that rape is forcible sexual intercourse. But no one seriously expects Ugly Albert to get sex any other way than by forcing the girl."

The fact that the only possible way to succeed is discreditable or illegal or immoral, is no answer to the point that it is nevertheless discreditable or illegal or immoral. It is still what it is, even if it is the only way. If the only way to make a mark is a cowardly way, that makes it no less cowardly.

"And besides, the reason we usually consider it cowardly to make a sneak attack is because the attacker avoids facing the consequences."

Not necessarily. As the Cambridge dictionary correctly recognized, what is necessary is an avoidance of a difficulty. It does not have to be specifically "facing the consequences".

So it should be fairly easy to see that it is not incorrect to say that the terrorists are cowards. It is furthermore, then, not incorrect to say that if someone says the terrorists are not cowards, then he is wrong.

But backing up: even though I have defended the familiar platitude that terrorist attacks are cowardly, nevertheless I do not think this accurately reflects man-in-the-street thinking on the topic. Rather, it represents an old political formula that has caught on and that hardly makes a ripple. It's about as meaningful as saying "good morning". It is not significant to say it; it would be significant to stop saying it. Same as "good morning": we say that in order to avoid doing anything significant. It is a distant cousin of the "dead metaphor" - a metaphor that has lost its force through overuse. But like the dead metaphor, its overuse does not mean that it is not valid.

Setting this aside, there is also the matter of the habit that some intellectuals have of shocking the bourgeoisie. If you say that the bad guys think that they're the good guys and we're the bad guys, you probably won't raise any eyebrows. But if you make the statement in a way that implies that you agree with the bad guys' assessment, or that you are positioning yourself as a neutral party who favors neither side, then you will probably raise some eyebrows. And based on my own experience, an awful lot of people like to present the rather familiar and tired and unremarkable view that the bad guys think that they're the good guys, in just such a way, so as to maximize their effect on their listener. This seeming undercurrent of support for the enemy is something that can be easily avoided without changing the factual content of what you're saying, but it is in my experience often not avoided; indeed, it seems to be sought out and nurtured. And then, when the predictable reaction occurs, like clockwork Mr. Epater-les-bourgeois loudly complains about the impossibility of making obviously true statements in front of the foolish masses.

I agree with your point about "difficulty of engaging the enemy's armed forces". But I still understand the frustrations of suicide bombers, because of the difficulty of significantly or meaningfully engaging some enemy's armed forces. Especially if you respect warriors, but not their guidance.

What is the brave action to take in that case? Simply suicide, and not suicide-attacks? Or better-targeted suicide-attacks? I am befuddled.

I am far more comfortable condemning suicide-attacks as irrational than cowardly.

I think questioning the Alabama bar analogy is useful within the context of this post. Whose attire is a belief in the value of giving primacy to skepticism, critical thinking, etc.? According to Eliezer's performance in the OP, it's not the attire of either Alabama bar patrons or "muslim terrorist suicide bombers", and both of those may signal, more generally, the losers of the American Civil War and non-white brown people. In short, I think there may be a gentrification of critical thinking: it's reserved for an in-group, perhaps in particular northeastern Anglo-Saxon and Ashkenazi Jewish male intellectuals, or an even narrower archetypal definition that might be reducible to zero actual people. I'm interested in the degree to which our behavior might be governed by aligning with and contesting these archetypes, including which beliefs-as-attire to wear (it's perhaps an archetype alignment for Stephen Hawking and Richard Dawkins to claim to be skeptical about religion; it would probably not be an archetype alignment for Oprah to publicly wear such belief attire, even if in fact she were a crypto-skeptic).

This post may meander a bit, but I think Eliezer's post (and some of the criticisms of it) is thought-provoking and may be getting us closer to a more real-world, real-time model of how bias and belief are operating in the world we live in.

I know if I were in an Alabama bar, and the conversation turned to how "terrorists hate our freedoms", I'd certainly phrase things such that they didn't contradict what everyone in the room was yelling about.

Bonus points if I were clever enough to disagree with them in a way that seemed like I was agreeing with them.

Either way, I'd be wearing a belief I most certainly did not actually believe and did not in any way believe I believed, and I would do so entirely for my own preservation.

Bonus points if I were clever enough to disagree with them in a way that seemed like I was agreeing with them.

If you don't transmit your disagreement, why bother expressing it? Outwardly agreeing with them would accomplish the same thing with less effort.

One reason is because dog-whistles can work: I have from time to time had the experience of expressing my opinion about a subject in a way that causes the minority who agree with me to recognize me as a potential ally without triggering reprisal from the majority who disagree with me.

Another reason is to preserve some credibility in case of a future discussion where I'm more willing to deal with the consequences of public opposition. Rather than having to say (for example) "Well, yes, I know I said policy X was a good idea, but I didn't really mean it; I was lying then, but you should totally believe me now because I'm totally telling the truth" I can instead say (for example) "I said that policy X is an efficient way of achieving goals Y and Z, which it absolutely is. But I don't endorse maximizing Y and Z at the cost of W, which policy X fails to address at all."

Yet another reason is to use plausible deniability as a way of equivocating, when I'm not sure whether to come out in opposition or not. That is, I can disagree while maintaining a safe path of retreat, such that if the degree of reprisal I get for disagreeing turns out to be more than I feel like suffering, I can claim to have been misunderstood and thereby (hopefully) avert further reprisals.

One reason is because dog-whistles can work: I have from time to time had the experience of expressing my opinion about a subject in a way that causes the minority who agree with me to recognize me as a potential ally without triggering reprisal from the majority who disagree with me.

That already goes by the name "politician-speak".

It's being more honest with yourself and your own beliefs, though it certainly isn't more honest with your fellow bar patrons.

If you have a thing against lying (and I do), it's the lesser of two evils.

The inspiration was from Professor Robert Thornton of Lehigh University, who came up with a creative way to write student "recommendations" that, if read literally, said quite directly that hiring this particular student was a very, very bad idea. If read figuratively, however, they sounded like glowing reviews, and indeed if you were expecting a good review you would think it was an absolutely wonderful review.

This was necessary because as a professor he was obligated to give students recommendations for their employers, but negative reviews have resulted in serious lawsuits in the past. Unwilling to compromise his morals, he got very creative with the English language instead of lying.

In that case, the reviews weren't meant for the student to ever see, but that is often unavoidable. He certainly did hope that the student's potential employer was capable of reading between the lines and comprehending the message.

He called his system L.I.A.R., if you want to search for it. They are pretty funny, and really do sound like positively glowing reviews until you look at exactly what they are actually saying.

HA: It seems to me that you think I have changed the topic. I agree with all of the sentences of your most recent comment, except for the first, but they don't seem to be about what I was saying.

Likewise, I agreed with Eliezer's post, but I thought that his analogy was, well, lacking in appreciation of the difficulties involved in analogy.

Basically, I think that Douglas Hofstadter's writings on the difficulties of natural language translation, the proper translation of literature, etc., are all relevant to the issue of the translation of inferred emotion. Evolution provides a scaffold for our brains to develop, biasing us, strongly or weakly, in certain directions: sometimes strongly enough that translation is almost always possible (maybe not with the Pirahã?) and we can speak of human universals, and sometimes weakly enough that we can talk about Liberal insensitivity to 3 of Haidt's 5 domains of morality. When the biases are strong, or when atypical emotional mixes emerge and propagate memetically, empathy ceases to be useful and "the other" must, in some respect or another, be understood empirically, i.e. without the benefit of our specialized social processing capabilities. In such cases, our anticipation suffers, but it suffers even more if we force bad analogies and continue to use our social processing capabilities in inappropriate circumstances.

HA: I chose my examples carefully to try to match as closely as possible as many of the categories, relationship types, motivations, etc., as I could, and the examples I came up with are both pervasively American and truly ugly from my perspective, in the closest way that I could think of (matching type of motive, e.g. content of emotional state, not degree of emotional state or degree of ugliness) to honor killings. My point was that we don't have any very close matches. Your examples still don't match the intensity of honor killings, but more importantly, the emotional quality is utterly, utterly different. Anger, retaliation, maintenance of public order, prevention of repetition, deterrence: the motivations for execution are simple and easy to understand, as is the balance calculation which compares the costs and benefits implicitly, even if a more careful calculation would disagree. At the most visceral level, honor killings are not in retaliation for some harm, nor are they motivated by preventing a harm. This article is probably worth looking at for everyone who is still reading this thread, by the way.

By the way, honor killings != state-sanctioned killing.

Constant, this blog has warned against the genetic fallacy before. What do you think would be a good characterization of the "other side"? Eliezer's characterization describes a large minority of Americans very well. (He's clearly not intending it to be descriptive of everyone who thinks Islamic extremism is a serious threat, if that's what you're thinking.)

I am in the process of working through these delicious posts so apologies in advance if my comments are redundant.

Perhaps group membership of a mutually supportive tribe has the greatest value (for example from both a psychological and survival perspective). If this is the goal, what is the most rational course of action? Will a rational person inevitably run into problems where the tool they are using to solve their problems becomes their primary source of problems?

I like this site for the very reason that it represents a community where my natural problem solving inclinations are not compromising my sense of being similar to those I interact with. But as with all communities I step with trepidation for fear of violating a social taboo which may be rationalised but is not reasonable (belief as attire). If we choose to be irrational because rationally we have decided it is the most rational course of action are we still rational?

Can we truly choose to be irrational, though? Recognizing the irrationality of a belief, and valuing reason, the most we can do is act as if we hold others' irrational beliefs. I'm sure there are many people who have done this throughout time; the tragedy is that each of these people may have "come out" as nonbelievers if they were aware of the others' presence.

While I personally think that a person compromises his integrity when he acts contrary to his beliefs, there are certainly many instances in which this course of action has survival value, and so can be said to be rational.

Using drugs, we could probably make ourselves less rational on purpose. Drinking lots of alcohol, for instance.

Actually, I think that much of this sequence, including the current post, can be thought of as an enumeration of feelings that people who aren't either quite young or quite nerdy have no analogues to. Philosophy is full of others, such as existential despair and satori.

xkcd's archive page doesn't include dates. The current comic as of 05:58:11 AM (whatever time zone) on 05 August 2007 (a Sunday) was Tesla Coil, so I'm guessing you were talking about the previous one, Lisp Cycles.

The biases of Rationalists are showing in this article.

It's peculiar to have a sequence on Korzybski's "The Map is Not the Territory" followed shortly thereafter by a post making a purely intensional distinction between "proper" and "improper" beliefs.

What's "improper" about achieving in-group identification? It's often quite handy. Casting intensional aspersions on all values we might derive from our beliefs other than predictive utility does not strike me as rational.

I think that the word improper is being used in the post in the same way that mathematicians use it in the phrase improper integral.

This usage is not pejorative, but marks a delicate extension of the basic concept. In mathematics the delicacy arises from the need to take a limit, which might not exist. In the case of an improper belief the believer is opened up to conflicts, perhaps because they belong to multiple groups with conflicting identities, perhaps because predictive utility competes with group identification in practical importance.

But this is one of my issues with what I have seen at lesswrong - the privileging of predictive utility, of epistemic rationality over instrumental rationality. Epistemic rationality is another form of instrumental rationality, but where rationalists gather, it gets privileged as if it were the only true rationality, or at least a better rationality. It's a mistake, and really impairs the ability of rationalists to understand other people who do not privilege epistemic rationality to the same degree, if at all.

You say improper is not used in a pejorative sense, but clearly the normal usage of "improper" is pejorative. And when an epistemic utility competes with another instrumental utility, why doesn't that equally make the epistemic utility improper?

Further, the non-epistemic beliefs are described as

but the other forms are arguably "not belief at all".

Time and time again, epistemic rationality is set up as the real, better, higher, truer, shinier rationality.

Just to be clear, I'm not here to trash the idea here. I came to the site from reading EY's Harry Potter fan fiction, which is just awesome, and I'm dying for the next chapter. Between the book and the sequences, I'm busy reading a guy making all my arguments and more, reading many of the key books I read years ago in graduate school. Korzybski and Jaynes are at the top of my pantheon (with Stirner, who I don't see a lot of influence from). So I'm here because of some very specific and fundamental shared methodology.

I don't say "me too" to all that I agree with, unless it is something new to me or I have a refinement to add. But on this point, I see privileging of epistemic rationality, and I think it's a mistake.

You would put instrumental rationality above epistemic rationality?

So if it makes me happy to believe the Moon is made of cheese, I ought to do so?

You would put instrumental rationality above epistemic rationality?

So if it makes me happy to believe the Moon is made of cheese, I ought to do so?

If making yourself happy is, all things considered, what you want to do. (And then assuming that said belief modification is the most effective way to gain happiness.)

You would put instrumental rationality above epistemic rationality?

I put winning above predictive accuracy, yes.

As fate would have it, the article What do We Mean by Rationality is the page that comes up in my chrome browser when I type "less"

It's a peculiar article, because it gives two concepts as a definition for rationality, Epistemic Rationality and Instrumental Rationality, where clearly the concepts are not identical. And yet all sorts of statements are made thereafter about Rationality without noting the difference between the two concepts.

To answer your question in these terms: for all beliefs where the Instrumentally Rational belief is X, and the Epistemically Rational belief is NOT X, I'd rather believe in X. I'd rather Win than Correctly Predict, where I have to make the tradeoff.

In all fairness, I think Islamic fundamentalists really do hate our freedom. They hate our entire way of life, and this freedom is a part of that.

Hating the freedoms of western society doesn't preclude one from committing brave, selfless acts, though. Unfortunately for us.

To paraphrase, there's a difference between resenting someone for having freedoms that you do not, and disliking the concept of "freedom". And these get mixed up on occasion.

Clearly they DO hate our concepts "freedom of religion" and "freedom of speech". (They will explicitly say so!) There may be some freedoms that they would value... though actually maybe not. Maybe they value deference to Islam so highly that any kind of individual freedom would entail the freedom to violate Islam and therefore be evil.

I appreciate the effort to sort out "improper beliefs". As a philosopher with a background in distinguishing surface-level propositions from speech acts with goals that may be masked by those propositions as such, I am inclined simply to say that "improper beliefs" are NOT beliefs. I prefer reserving "belief" for the anticipatory dispositional beliefs that you call "proper".

This is so far just a semantic difference, but the real difference comes out when you say that people have to "convince themselves they are passionate". From my perspective, no such "convincing" is necessary when a person moves from literal to nonliteral interpretations of mythic language, because the esoteric perspective can be as exciting and full of significance as the exoteric. People can be passionate about the real, positive benefits of religious practices: psychological well-being, social connectedness, aesthetic sensibility, self-respect, etc. Discovering these benefits as the real meaning of myths can be as eye-opening as the adoption of a counterfactual, mythic perspective.

But most people clearly DON'T treat these things as meaningless speech acts.

How do I know this? Because if you say something like "Right, because that's just a meaningless speech act" in response to some absurdity of religion like "virgin birth" or "transubstantiation", people will get VERY ANGRY at you. They will not respond as though they are playing a game of words; they will respond as though you have accused them of lying. And if improper beliefs are precisely non-beliefs trying to make themselves look like beliefs, then you HAVE just accused them of lying.

The only way this comment makes sense to me is if I assume that you believe that (for example) humans reliably fail to become angry when their tribal attire is challenged, unless that tribal attire also happens to be a meaningful belief.

Do you in fact believe that?

If so, can you expand on your reasons for believing that? It seems implausible to me, and inconsistent with my observations of human behavior.

The only way this comment makes sense to me is if it was written without reference to its grandparent.

Bob wrote "The person who calls himself a global warming skeptic... after reading a couple of books and a few articles arguing for [such skepticism] will often acknowledge that if he'd started by reading books advocating alternative views, then he would not have come to be a global warming skeptic..." This is one mechanism, but sometimes positions just "feel right" to people, i.e. in agreement with their predisposed visions, or traits.

Also it seemed to me that by asking of people that they examine as many arguments opposed to their view as they examine in alignment with their view, you would also be demanding a similar objectivity from scientists. But as has been said often, scientists are only human. They pursue their hunches (conjectures); and natural selection knew what it was doing when it made all of us normally tend to do the same.

This is not to strongly discount a goal of overcoming bias, but is to confirm a point doubtlessly made here before, that not only does bias exist for a reason but can in many instances be optimal for achievement or survival. Admittedly, truth seeking and achievement may be at odds with one another at times.

Re: the Alabama bar, when that same criticism was leveled by Neil Young, the response was, "A Southern man don't need him around anyhow". Apart from the fact that it came in the form of a hit song, the reply is notable in that it's not something along the lines of "them's fightin' words!" Though you may be right about the South's religious and political attitudes, I think you misunderstand how and when violence is used in that culture.

Anyway, back to the issue. The mindset of Mohamed Atta, et al., was elegantly described by Eric Hoffer in The True Believer. I don't believe it takes any unusual emotional insight to understand Atta's psychology, if it's seen in those terms.

Ignorance. I may think I understand their minds, but that does not prove that I do understand their minds.

All you know is that you have a mental model of their minds which seems credible to you. Have you tested this model, and if so, how?

All I am reasonably sure of is that they did not see their act as evil and cowardly. Doubtless the same was true of Jack the Ripper and the Boston Strangler, but that tells me nothing about the differences between them and everyone else. After all, I only think that is true of them because it seems to be true of most people.

It is really just an assumption.

That's really an interesting question. What about the ones who really ARE insane psychopaths? Do they think they are doing the right thing, or do they really just not care?

I'm inclined toward the latter, actually. I've read some of Stalin's journals, where he says things like "What is efficient is good. What is inefficient is bad." That sounds like he literally doesn't understand what morality MEANS to ordinary, non-psychopathic people.

Sounds like he still needs to classify things as good and bad, though. Maybe a previous moral framework has collapsed and he's looking for an alternative? Any idea whether he had goals beyond 'survive'?

Their belief or cowardice is not the problem. We must be concerned about their expected behavior. The rest is commentary.

Konrad: Not to repeat myself yet again, but no, understanding psychology never requires unusual emotional insight. It takes analytical ability, but it gives a different type of understanding from that which emotional insight gives.

Bob, I take it you're not the deceased kiwi atmospheric scientist Robert "Bob" Unwin. But very high quality commentary. I hope that you start a blog to consolidate your observations under this name/pseudonym (as I have done with mine).

Michael, how about the point that you're (rather explicitly now) picking a point upon which to manufacture in-groups and out-groups. In-group: those of us who get motivations for execution. Out-group: those who get honor killings.

The in-groups and out-groups change if the point to get is Abrahamic monotheism, or if the point to get is state-sanctioned punitive killings. It seems to me that you're picking one that's particularly salient either to you or to what you imagine your audience to be. I think this gets to belief as attire/beliefs as cheers for teams. It's an attempt to pick teams, but I think the implied in-groups and out-groups are at least in theory contestable.

A bit tangentially, I think teams themselves can be an effective (the most effective?) way to construct hierarchical privilege. The people on the field vs. the people on the bench (or the people relegated to the audience) of the two teams.

In terms of overcoming bias, I think understanding and when necessary countering these phenomena is important primarily to the degree that they warp decision-making or increase economic waste/existential risk.

Michael, I think your example is interestingly rooted in an implied in-group/out-group construction that constructs Americans in a flattering way. Consider that you contrast honor killings with forcing kids to go to law school or day camp, which won't necessarily result in their death. It's a flattering contrast that I think constructs America as Western and honor killers as culturally Middle Eastern. But if we contrast cultures that approve of state-sanctioned killing of people for moral transgressions, America and the nations of the honor-killers are now in the same group, with Western Europe (and much of the rest of the world) in the other group. Incidentally, I'm not opposed to state-sanctioned killing, but I think it would be more rational for the penalty to start with doing it to individuals to the extent it will prevent future great economic waste/increase in existential risk, rather than to punish premeditated murder of a small number of people or purported extramarital/premarital sex.

Hopefully_Anonymous: You seem invested in labeling people as using "muslim terrorists" for in-group/out-group construction, and I think it's coloring (biasing?) your analysis.


Silas, My opinion: you seem invested in using "muslim terrorists" for in-group/out-group construction, and I think it's coloring (biasing?) your analysis.

Michael, great criticism of an element of Eliezer's post.

Eliezer, Brilliant post, in my opinion. Clarifying and edifying. I'm looking forward to where you're going to go with this analysis of how bias and belief operate.

Eliezer, your lack of familiarity with "the other side" on the topic of terrorists is all too obvious from your crude attempt at a characterization of it. All you appear to know about it is a few platitudes. Often I get the feeling from this site that it is not so much about overcoming your own biases as it is about coming up with new excuses to dismiss views that you don't agree with by applying the genetic fallacy over and over (e.g. "suchandsuch belief is a product of suchandsuch bias").

it would be like dressing up as a Nazi for Halloween.

"So remember kids, dressing up like Hitler in school isn't cool."

Bob, a very high quality comment, but at 800 words it is too long for a comment. Please everyone, let's try to keep comments under about 400 words - longer items should be their own post or essay somewhere, which you can of course link to in a comment.