(I originally tried to make this a comment, but it kept on growing.)
I was looking through the Google results for "Less Wrong" when I found the blog of a rather intelligent Leon Kass acolyte, who's written a critique of our community. While it's a bit of a caricature, it's not entirely off the mark. For example:
Trying to think more like a mathematician, whose empiricism resides in the realm of pure thought, does not predispose these 'rationalists' to collect evidence from the real world. Neither does the downplaying of personal experiences. Many are computer science majors, used to being in the comfortable position of being capable of testing their hypotheses without needing to leave their office. It is, then, an easy temptation for them to come up with a nice-sounding theory which appears to explain the facts, and then consider the question solved. Reason must reign supreme, must it not?
How seriously do you take this critique? Do you wonder why I'm bothering with this straw-man criticism of Less Wrong?
Actually, I've deceived you; there's no such Leon Kass devotee. The quote above is a very minor adaptation of this Kaj Sotala post, which I've changed from the first person plural to the third person plural. Read it if you like, and then reevaluate the critique. (Yes, I know it's less coherent out of its actual context.) Does it seem to be less of a caricature now that you've read a version in which you identify with the writer, rather than one in which the writer is analyzing and criticizing you from outside?
Now I hope that this little trick (which people are starting to expect around here) caught your attention. We really do seem to react to an outside analysis of a person or group in very different ways, depending on whether we've been primed to identify ourselves more with the author or with the object of analysis. One might say that we strongly object to being modeled as a more or less predictable agent in someone else's scheme: that we instinctively reject any out-group analysis of our own personality and cognition. (Compare people's reluctance to trust the Outside View even when they know it's more reliable.)
Some experimental evidence for this asymmetry:

- We can predict a stranger's extraversion scores on the Implicit Association Test quite well by watching them act out a one-minute commercial.
- We can improve our predictions further if we're first told about a few nonverbal tics to watch for.
- We're bad at predicting our own extraversion scores on the IAT, even if we're given the video of ourselves acting out the commercial and told about the nonverbal tics.
This screams out "blind spot" to me, and one with a nice evolutionary reason to boot: a blind spot toward one's own patterns of action would make it possible to sincerely promise something we'll probably fail to do, which is a pretty nice trick to have up one's sleeve in a social environment. (Anyhow, the cute ev-psych story can be discarded without affecting the rest of the evidence.)
If it's true that we instinctively reject an out-group analysis of our actions, what might we expect to happen when we're faced with one? The most likely reaction would be a knee-jerk dismissal of the model, with justification to be constructed after the fact; or perhaps we'd take offense. If that were so, then we might expect the following kinds of results:
- We often analyze our friends when they're absent, in ways that would offend them if they were present; and yet we don't feel we're being dishonest or unfair.
- Political groups have explanations for why other people believe the way they do, and no group ever accepts another group's proposed explanation of its own beliefs.
- An analysis of behavior that puts us in the group being analyzed, and the speaker outside it, never sits right with us. We can, however, consider it more openly if the speaker argues that this is true for a class including themselves; this, after all, is an Inside View.
And if a fourth obvious case doesn't occur to you, you must have been somewhere else for the past week.
Fortunately, it seems to me that a fix is available: if the analysis is set up so that the (intended) readers identify more with the analyst than with the objects of analysis, they seem to avoid that blind spot. (They don't have to share all the characteristics of the analyst to do that; I would bet that female readers didn't have a moment of difficulty identifying with Eliezer rather than the woman on the panel in this anecdote, where the implied divide was "rationalists versus irrationalists" rather than "men versus women".) Keeping your readers with you is usually not that hard to do in practice; it's what writers call "knowing your audience", and if you imagine delivering your statement to the proper audience, you should (subconsciously, even) do better at avoiding that split between yourself and them.
The key thing, though, is that readers don't seem to do this on their own, not even rationalists. This is not a failing of one part of this community or another; this seems to be part of the current human condition, and it behooves a good communicator to avoid implicit "Us/Them" splits that leave a good part of the intended readership in "Them". In particular, writing with more care on gender is worth the cost in extra words and thought: gender-specific pronouns really do seem to cause distraction and dis-identification with the author, and I'd predict that the difference between
"most people here don't value social status enough and (especially the men) don't value having sex with extremely attractive women that money and status would get them"

and

"money and status would make a man more attractive to many women; men who really value a better romantic or sexual life should thus put more priority on money and status"
is pretty significant to a female reader (please correct me if I'm wrong). In the first, it's generally just the male readers who can easily take it as an analysis from their perspective, while female readers identify themselves with the thing being (very crudely) modeled from outside. In the second, female and male readers can identify themselves with someone making an actual choice or observation, or equally well envision themselves looking at the whole situation from outside.
Therefore, I suggest that when your post or comment touches on a subject that divides the Less Wrong community into identifiable groups (transhumanism or PCT or libertarianism, not just gender), it's good practice to imagine reading your contribution out loud to members of the various subgroups, and edit if you feel it might go over badly. This goes double if you're analyzing a general tendency in a group you don't belong to. (ETA: Sometimes it might be necessary to go ahead and damn the torpedoes, but I think that on some subjects we're being far too lax in this respect.) On the other hand, if someone analyzes you or your group from outside (and, needless to say, gets it wrong), I'd suggest you show a little extra patience with them; neither of you need be exceptionally irrational/sensitive/insensitive for this kind of impasse to arise.
P.S. I've hung back from the Less Wrong Gender Wars for a while, in part because I wanted to observe it a bit before committing myself to a position, and in part because everything I had to say seemed wrong somehow. I finally started writing out a comment listing several hypotheses for how we could have a situation where one good rationalist feels that a way of speaking is clearly unethical (while not necessarily incorrect in substance), and another good rationalist appears to be, not just disagreeing, but actively mystified about what could be wrong with it. Then I realized that one of my hypotheses was much better supported than the others.
EDIT: At the request of several, I've stopped diluting the term "Outside View" and called this particular thing "out-group analysis."