My initial reaction to Combat vs Nurture was to think "I already wrote about that!" and "but there should be three clusters, not two". However, looking at my old posts, I see that my thinking has shifted since I wrote them, and I don't describe the three clusters in quite the way I currently would. So, here is how I think about it:

  • "Face Culture" / "Playing Team": When people offer ideas, their ego/reputation is on the line. It is therefore important to "recognize the value of every contribution" -- accepting or rejecting an idea has a strong undercurrent of accepting or rejecting the person offering the idea. This sometimes makes rational decision impossible; the value of team members is greater than the value of the specific decision, so incorporating input from a team member can be a higher priority than making the best decision. Much of the time, bad ideas can be discarded, but it involves a dance of "due consideration": an idea may be entertained for longer than necessary in order to signal strongly that the team valued the input, and downsides may be downplayed to avoid making anyone look stupid. Eventually you want someone on the team to point out a decisive downside in order to reject a bad idea, but ideally you coax this out of the person who originated the idea. (I called this "face culture", but I have heard people call it "playing team".)
  • Intellectual Debate: It is assumed that engaging with an idea involves arguing against it. Approving of an idea doesn't signal anything bad, but, unlike in face culture, arguing against an idea doesn't signal anything bad either. Arguing against someone (genuinely or as devil's advocate) shows that you find the idea interesting and worth engaging with. Concepts like burden of proof are often applied; one tends to operate as if there were an objective standard of truth to which both sides of the debate are accountable. This warps epistemic standards in a variety of ways: for example, it becomes unacceptable to bring raw intuitions to the table without putting them in justifiable terms (even if a raw intuition is your honest reason). However, if you have to choose between face culture and intellectual debate culture, intellectual debate is far better for making intellectual progress.
  • Mutual Curiosity & Exploration: I called this level "intellectual honesty" in my old post. This level is closer to the spirit of double crux or circling. In this type of conversation, there may still be some "sides" to debate, but everyone is on the side of the truth; there is no need for someone to take one side or the other, except to the extent that they hold some intuitions which haven't been conveyed to others yet. In other words, it is more natural to weave around offering supporting/contrary evidence for various possibilities, instead of sticking to one side and defending it while attacking others. It is also more natural for there to be more than two possibilities on the table (or more possibilities than people in the conversation). People don't need to have any initial disagreement in order to have this kind of conversation.

Whereas Ruby's Combat vs Nurture post put the two cultures on a roughly even footing, I've obviously created a hierarchy here. But the hierarchy swings between the two poles of combat and nurture. Ruby mentioned that there's a contrarian aspect to intellectual debate: the bluntness manages to be a countersignal to the more mainstream niceness signal, so that getting blunt responses actually signals social acceptance. Yet Ruby also mentions that the culture amongst Bay Area rationalists is primarily nurture culture, seemingly aligning with the mainstream rather than the contrarian combat culture. I explain this by offering my three-layer cake above, with mutual curiosity and exploration being the meta-contrarian position. Although it can be seen as a return to nurture culture, it still differs significantly from what I call face culture.

I've said this before, and I'll say it again: placing these conversation cultures in a hierarchy from worse to better does not mean that you should frown on "lower" strategies. It is very important to meet a conversation at the level at which it occurs, respecting the games of face culture if they're being played. You can try to gently move a conversation in a positive direction, but a big part of the point of my original post on this stuff was that the underlying cause of different conversational practices is the level of intellectual trust present. Face culture is a way to manage conversations where people lack common knowledge of trust (and perhaps lack actual trust), so they must signal carefully. Intellectual debate requires a level of safety such that you don't think an argument is a personal attack. Yet, at the same time, intellectual debate is a way of managing a discussion in which you can't trust people to be detached from their own ideas: you expect people to be biased in favor of what they've proposed, so you embrace that dynamic and construct a format where intellectual progress can happen anyway. The level of mutual curiosity and exploration can only be reached when there is trust that everyone has some ability to get past that bias. (Actually, double crux seems like a bridge between intellectual debate and mutual exploration, since it still leans heavily on the idea of people taking sides.)

Having established a meta-contrarian hierarchy, we can extend the idea further. This stretches things a bit, and I'm less confident that the five levels which follow line up with reality as well as the three I give above, but it seems worth mentioning:

  • 0. Open Verbal Combat: This is the "lower" level which face culture is a reaction to. Here, everyone's ego is out in the open. There is still a veneer of plausible deniability around intellectual honesty: arguments would be meaningless if no one respected the truth at all and only argued whatever was convenient in the moment. However, at this level, that's almost exclusively what's happening. Even in cases where it looks like arguments are being respected for their undeniable force, there are a lot of status dynamics in play; people are reacting to who they can expect to be on their side, and logic only has force as a coordinating signal.
  • 1. Face Culture.
  • 2. Intellectual Debate.
  • 3. Mutual Curiosity.
  • 4. Exchanging Gears: Once everyone has a common framework of mutual curiosity, in which exchanging intuitions is acceptable and valued rather than steamrolled by attempts at objectivity, then a further evolution is possible, which involves a slight shift back towards combat culture. At this level, you don't even worry very much about deciding on the truth of things. The focus is on exchanging possible models; you trust that everyone will go and observe the world later, and update in favor of the best models over a long period of time. Articulating and understanding models is the bottleneck, so it deserves most of the attention. I think this is what Ben Pace describes in Share Models, Not Beliefs. However, this shift is smaller than the shifts between levels below this one (at least, in terms of what I currently understand).

Again: the biggest take-away from this should be that you want to meet a conversation at the level at which it is occurring. If you are used to one particular culture, you are very likely to be blind to what's going on in conversations following a different culture, and get frustrated or frustrate others. Read Surviving a Philosopher-Attack if you haven't, and keep in mind that responding from combat culture when someone is used to nurture culture can make people cry and never want to speak with you ever again.

12 comments

I've been at a workshop and haven't had much chance to engage with this post. Thanks for writing it; it's an excellent reply and says many things better than I managed to. I especially like the hierarchy which swings between nurture and combat; that seems well described to me. Also strong endorsement for meeting conversations where they're at.

I'm currently of the view that anything below level three is a complete waste of time, and that if we can't find a way to elevate the faith level quickly and efficiently then we have better things to be doing and shouldn't engage much at all. (This is mere opinion, and a very bold one, so I encourage people to try to wreck it if they think they can.)

I've avoided people/conversations on those grounds, but I'm not sure it is the best way to deal with it. And I really do think good intellectual progress can be made at level 2. As Ruby said in the post I'm replying to, intellectual debate is common in analytic philosophy, and it does well there.

Maybe my description of intellectual debate makes you think of all the bad arguments-are-soldiers stuff. Which it should. But, I think there's something to be said about highly developed cultures of intellectual debate. There are a lot of conventions which make it work better, such as a strong norm of being charitable to the other side (which, in intellectual-debate culture, means an expectation that people will call you out for being uncharitable). This sort of simulates level 3 within level 2.

As for level 1, you might be able to develop some empathy for it at times when you feel particularly vulnerable and need people to do something to affirm your belongingness in a group or conversation. Keep an eye out for times when you appreciate level-one behavior from others, times when you would have appreciated some level-one comfort, or times when other people engage in level one (and decide whether it was helpful in the situation). It's nice when we can get to a place where no one's ego is on the line when they offer ideas, but sometimes it just is. Ignoring it doesn't make it go away; it just makes you manage it ineptly. My guess is that you are involved in more level-one situations than you think, and would endorse some of it.

Somewhat related to this, I sometimes do the following and would be interested in feedback on whether I'm coming across the right way. At the start, I'm curious why another person thinks what they think, and because I can't expect an instant answer if I post a question online, I try to guess first. If I'm not confident in any of my guesses after a while, I'll write down and post the question, and since I already have some guesses, I write those down as well in order to signal that I'm taking the other person seriously. But I'm not sure if I'm succeeding in this signaling attempt (e.g., maybe other people think I'm straw-manning them, or something else I'm not thinking of). Here is a recent example of me trying to do this. Feedback welcome on whether this (i.e., signaling by writing down my guesses) is a good idea, and whether I'm succeeding in my attempts.

"since I already have some guesses, I write those down as well in order to signal that I'm taking the other person seriously."

I am not sure how others perceive such guesses in general, but in my experience I find them very helpful for avoiding the Double Illusion of Transparency, as they can reveal assumptions that either the reader or writer was making and that the other didn't know about.

I think it's usually a good idea overall, but there is a less cooperative conversational tactic which tries to masquerade as this: listing a number of plausible straw-men in order to create the appearance that all possible interpretations of what the other person is saying are bad. (What it feels like from the inside: all possible interpretations are bad; I'll demonstrate it exhaustively...)

It's not completely terrible, because even this combative version of the conversational move opens up the opportunity for the other person to point out the (n+1)th interpretation which hasn't been enumerated.

You can try to differentiate yourself from this via tone (by not sounding like you're trying to argue against the other person in asking the question), but this will only be somewhat successful, since someone trying to make the less cooperative move will also try to sound like they're honestly trying to understand.

Let's call this process of {exposing our guessed interpretations of the other person's position}... uh... "batleading".

I wonder how often the impulse to batlead is not correctly understood by the batleader, and how often, when people respond as if we're strawmanning, or as if we're failing to notice our confusion and trying prematurely to dismiss a theory we ought to realise we haven't understood (when really we just want to batlead), we tragically lack the terms or the introspection to object to that erroneous view of our state of mind, and things just degenerate from there.

This felt like a useful evolution of the Combat vs Nurture concept, and is in fact the model I am usually using when thinking about intellectual culture.

Very quick note to say that my current understanding is that this post mostly describes something orthogonal to my original post (an alternative dimension) rather than actually the same thing.

Interesting. Am curious to hear more at some point about what differences are salient to you. (I get that it's carving up the landscape in a different way. It seemed to me it carved up the same landscape in a way that felt more directly useful to me, since I in fact don't really want either a nurture or a combat culture.)

This got elaborated on in the much less acclaimed sequel, but the key point that I settled upon after thinking and writing about the topic wasn't <here's the list of the different cultures> but the fact that a key part of any culture is how it interprets various statements/actions/expressions.

Nurture and Combat are just examples of cultures that apply a set of consistent interpretations based on a set of consistent priors.

Never mind what the cultures are, though; what's important to remember is that the same action, e.g. "bluntly disagreeing with something you said", will be interpreted differently in different cultures. What is hostile and demeaning in one is friendly and respectful in another.

[And then a heap of stuff will be downstream of this. To the extent that culture-specific interpretations are automatic, reflexive, and deeply ingrained, you'll face hard cross-cultural conflicts. If you want to shift the culture, effectively you are trying to coordinate everyone to assign new meaning to old actions/statements/expressions, etc.]

Nod. I felt like I got that from this essay too, but I agree it's not as front-and-center, or explored as much.