"In a sufficiently mad world, being sane is actually a disadvantage"

– Nick Bostrom

Followup to: What is rationality?

A canon of work on "rationality" has built up on Less Wrong; in What is rationality?, I listed most of the topics and paradigms that have been used extensively on Less Wrong, including: simple calculation and logic[1], probability theory, cognitive biases, the theory of evolution, analytic philosophical thinking, and microeconomics. I defined "rationality" as the ability to do well on hard decision problems, often abbreviated to "winning" - choosing actions that cause you to do very well.

However, I think that the rationality canon here on Less Wrong is not very good at causing the people who read it to actually do well at most of life's challenges. This is therefore a criticism of the LW canon.

If the standard to judge methods by is whether they give you the ability to do well on a wide range of hard real-life decision problems, with a wide range of terminal values being optimized for, then Less-Wrong-style rationality fails, because the people who read it seem mostly to succeed only at the goal that most others in society would label as "being a nerd".[2] We don't seem to have a broad range of people pursuing and winning at a broad range of goals (though there are a few exceptional people here).

Although the equations of probability theory and expected utility do not state that you have to be a "Spock rationalist" to use them, in reality I see more Spock than Kirk. I myself am not exempt from this critique.

What, then, is missing?

The problem, I think, is that the original motivation for Less Wrong was the bad planning decisions that society as a whole takes.[3] When society acts, it tends to benefit most when it acts according to what I would call the Planning model of winning, where reward is a function of the accuracy of beliefs and the efficacy of explicitly reasoned plans.

But individuals within a society do not get their rewards solely based upon the quality of their plans: we are systematically rewarded and punished by the environment around us according to:

  • Our personality traits and other psychological factors such as courage, happiness set-point, self-esteem, etc.
  • The group we are a member of, especially our close friends and associates.
  • The shibboleths we display, the signals we send out (especially signaling-related beliefs) and our overall style.

The Less Wrong canon therefore pushes people who read it to concentrate mostly on the wrong kinds of thought processes. The "planning model" of winning is useful for thinking about what people call analytical skill, which is in turn useful for solitary challenges that involve a detailed mechanistic environment that you can manipulate. Games like Alpha Centauri and Civilization come to mind, as do computer programming, mathematics, science and some business problems.

Most of the goals that most people hold in life cannot be solved by this kind of analytic planning alone, but the ones that can (such as how to code, do math or physics) are heavily overrepresented on LW. The causality probably runs both ways: people whose main skills are analytic are attracted to LW because the existing discussion on LW is very focused on "nerdy" topics, and the kinds of posts that get written tend to focus on problems that fall into the planning model because that's what the posters like thinking about.



[1]: Simple calculation and logic are not usually mentioned on LW, probably because most people here are sufficiently well educated that these skills are almost completely automatic for them. In effect, it is a solved problem for the LW community. But out in the wider world, the sanity waterline is much lower. Most people cannot avoid simple logical errors such as affirming the consequent, and cannot solve simple Fermi problems.
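For readers unfamiliar with the term, a Fermi problem is solved by chaining rough order-of-magnitude estimates. A minimal sketch of the classic "piano tuners" example - every input figure below is an assumed round number, not real data:

```python
# Hypothetical Fermi estimate: full-time piano tuners in a city of ~3 million.
# All figures are assumed round numbers chosen for illustration.
population = 3_000_000
people_per_household = 2.5
piano_fraction = 1 / 20                  # assume 1 in 20 households owns a piano
tunings_per_piano_per_year = 1
tunings_per_tuner_per_year = 2 * 5 * 50  # 2 a day, 5 days a week, 50 weeks

pianos = population / people_per_household * piano_fraction   # ~60,000 pianos
tuners = pianos * tunings_per_piano_per_year / tunings_per_tuner_per_year
print(round(tuners))  # → 120
```

The point is not the final number but that each factor is easy to estimate to within a factor of a few, so the product usually lands within an order of magnitude of the truth.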

[2]: I am not trying to pass judgment on the goal of being an intellectually focused, not-conventionally-socializing person: if that is what a person wants, then from their axiological point of view it is the best thing in the world.

[3]: Not paying any attention to futurist topics like cryonics or AI, which matter a lot; making dumb decisions about how to allocate charity money; and making relatively dumb decisions about how to efficiently allocate resources to make the distribution of human experiences better overall.



There's too much individualism in the current LessWrong rationality. I remember a folk tale I read, describing the adventures of two individuals named something like Solves-Problems-By-Himself and Asks-Others-For-Help. Given the task of preserving meat from rotting, the former shielded the meat from the sun with large leaves and dripped water on it. The latter gave away the meat in exchange for an identical piece delivered at the end of the contest.

It was sort of cultural-shock jarring to me when I read it, because "obviously" producing the "identical" piece shouldn't be counted as having preserved the original. But we have too many lone-hero-genius stories, and not enough "so-and-so was stumped so he asked his sister" sort of stories.

AFAIK, LW is a spin-off from work on FAI. Since FAI needs to be gotten right the first time, it isn't surprising if LW is oriented towards planning.

One thing I haven't seen discussed is the process of generating new ideas. I was thinking about this as a result of the lurkers thread-- some people said they didn't post because anything they thought of had already been said.


I'm a lurker, and have read more posts than comments, but it seems that there actually is attention paid here to how to increase one's emotional intelligence: how to influence one's own emotions, how to help emotions correspond to reality, how to forget them when it's necessary. This is the same issue that (to my limited knowledge) Roman philosophers were very concerned with.

Maybe it's not explicitly applied, in a "How to Win Friends and Influence People" style, but I think the population here (myself included) tends to be turned off by that style. This blog's style is one that someone like me can understand and apply. But I think we need to distinguish between style (which is optimized for nerds) and substance (which is quite human and universal, as I understand it, and not at all confined to making "society" better at the expense of the challenges of one's own life).

In other words, if LW isn't helping me win, then it's I who am doing something wrong.

Yes, it is true -- debiasing is a form of emotional intelligence. Do you find that you personally do well at challenges that require emotional intelligence? (Obviously you don't have to give specifics in a public forum, but in general.) If a third-party observer assessed your life, do you think that they'd put you above the 90th/80th/70th etc. percentile in terms of:

  • Quality of personal relationships
  • Ability to successfully deal with others and comprehend social networks
  • Ability to read others' emotions and project the right emotions to others
  • Ability to be in control of your own emotions and sense and express how you feel?
No, I'm not very good, which is actually why I'm here. My personal relationships are fine; my emotions, not so much.

As I wrote earlier on LW:

If you ask me, the term "instrumental rationality" has been subject to inflation. It's not supposed to mean better achieving your goals, it's supposed to mean better achieving your goals by improving your decision algorithm itself, as opposed to by improving the knowledge, intelligence, skills, possessions, and other inputs that your decision algorithm works from. Where to draw the line is a matter of judgment but not therefore meaningless.

Skills other than rationality matter a lot, and a rational person will seek to learn those skills (to the extent that they're sufficiently useful/easy), and it isn't implausible that those skills should be discussed on LW, but that doesn't mean there's something wrong with our conception of rationality.

ETA: I guess you could argue that there are different skills involved in being rational about nerd topics and being rational about non-nerd topics, and that we haven't focused enough on the latter.

The group we are a member of, especially our close friends and associates.

I think who you know is probably the most important element of social and financial success. To win, more rationalists need to help each other along, for example by hiring and mentoring each other.

Seriously though, people always underrate how important this is.

You may be interested in the Existential risk career network [http://www.xrisknetwork.com/] if you are thinking of making even moderate donations to reduce existential risks. If you email me at rmijic "at" googlemail dot com, I can get you in touch with Frank, who runs the network.
And what is your domain of expertise? [http://lesswrong.com/lw/26l/what_are_our_domains_of_expertise_a_marketplace/]
Philosophy, public policy (and political strategy), writing and decent grounding in cognitive science and associated fields. Which I felt a lot better about before the recession.
That's a tough one; you can do a wide range of things with that background, but it's not specialized enough to easily bootstrap yourself with a first job. I'm sure I am saying the obvious, but law is a good possibility with that background. Another possibility is to specialize in something more applied, if you can stay in school longer. What do you see yourself doing, short of being a political strategist?

I think you make some great points in this post:

Our skill in dealing with people, which we might call "emotional intelligence".

There used to be more activity in this regard, with a lot of people writing about the pickup community and pickup artists. Unfortunately there were complaints about women being objectified, and after that you didn't read much about this topic anymore.

Another thing is that I have the impression that LW is becoming more and more about signaling (I got this from Robin Hanson's writings) rationality as opposed to actually working on it.

Another thing is that I have the impression that LW is becoming more and more about signaling (I got this from Robin Hanson's writings) rationality as opposed to actually working on it. That is, what counts is making an elaborate post with sophisticated reasoning in order to impress, regardless of whether it can or will be actually implemented. Maybe useful if you are building a GAI, not so much if you want to improve yourself.

Not more and more. You're just becoming more socially aware. You've taken the Red Pill, now the trick is to learn to live with what you see, without becoming embittered. Because, as the name suggests, LessWrong has slightly less signalling bullshit relative to information than average for humans. That's the best you can expect, now make the most of it.

Or he just watched The Matrix. I don't think Roissy can take credit for that particular expression.
Virtually nothing Roissy writes about game is new or original, including his reference to the Matrix, which I saw used about 5 years ago. He just succeeded in popularizing it to new audiences, and linking it with conservative politics. Roissy is not an important or representative figure in the larger community.
As a note of caution to those not familiar with the name, Roissy might be construed as an example of the 'worst advocates' of PUA by many. He is a talented writer and at least thought provoking, arguably quite insightful, but Tyler Cowen described him as EVIL [http://www.marginalrevolution.com/marginalrevolution/2008/06/where-pretty-li.html] and he is not for the faint of heart. Anyone tempted to google the name is probably at least owed a NSFW warning (purely text based NSFW) and possibly a NSFTEO (The Easily Offended) warning.

I stopped reading his stuff when I realized it was having a negative effect on how I think of women, sexuality, and my own sexual identity. (I am a hetero male).

Funny that he does so partially for a set of reasons that are falling into disfavour on LessWrong. This also piqued my interest: I'm not quite sure what he is referring to.
Who on earth is Roissy? Never heard of him.
Roissy [http://www.takeonit.com/expert/777.aspx]. The title of his blog, "Roissy in DC" alludes to another Roissy [http://en.wikipedia.org/wiki/Roissy-en-France#Literary_reference] of literary significance.
Roissy talks about "game", which has been de facto banned from LW because it causes the site to go all ga-ga about gender politics.
Regrettably, I think that LW is not yet strong enough to tackle this issue head on. Note though, that if we move LW more towards real-world winning, we may find that the discussion quality for real-world issues gets to the stage where the issue can be mentioned.
Whence this idea?

Rarity value. :)

This seems to be my impression too, though I must admit to possible bias.
Observation of a few weeks of commenting would make this apparent to anyone with a modest amount of social awareness. This is not something that should be surprising. I find my own popularity skyrocket when doing, for example, yoga. Being the scarce gender gives all sorts of power! :D
I honestly don't see it and think of myself as being fairly socially aware (though I suppose I wouldn't know if I was wrong about that). What power do women here have that men don't?

I'm not sure that higher status (as claimed in the OP) is quite the right way to phrase it but it seems to me that women are treated differently and I am also aware that I do it myself. Some observations:

  • Society teaches us that women are fragile and that men should be careful not to hurt them. This is an observation about physical differences but it carries over to verbal interactions. I find myself being more careful with my words when I know I am communicating with a female. I get the impression that others are too.
  • Women are a minority here and we are often reminded of this and encouraged to create a welcoming environment for them. I know of no other clearly defined group we have a similar community norm for.
  • Many males here are of a personality type that means they will not have a lot of natural success with women and will be enthusiastic about the prospect of talking to women who appear to share their interests. They will not always act on this enthusiasm in an effective way. This phenomenon is also common in other communities that attract similar personality types.
  • Females legitimately offer a perspective that is novel here and so their insights are inherently more valuable than those that can be offered by the many nerdy male computer programmer types who share similar perspectives with each other. They thus are more likely to post insightful comments than average in some areas.

There's probably others but I hope this gives you some things to consider.

Good points. I can see some of this. I also agree that "higher status" is not the way to describe it. My experience, here and elsewhere, is that on average women are more interested in exploring ideas and less in getting into a debate. A very high percentage of comments here basically say "Some aspect of the post above this one is wrong", so often my default reaction to a reply to one of my comments is to take the reply as a reason why I'm wrong. But I've found this default fails more often when the replier is female. Relatedly, I do find myself being less belligerent and aggressive in replies to women, but I think this is mostly me just matching their tone instead of automatically altering mine when I see they are a woman.

Another aspect of this issue is that debating between males probably triggers egos more readily than debating between males and females, so perhaps some men here are less aggressive when arguing with females because females don't trigger programming that evolved to guide us in battles for alpha male status, tribal supremacy and mating privileges. This might be one reason more female posters would help the community; egos care not for truth.

True. But this isn't really because women are special or unique among such groups, it's because we aren't even diverse enough to worry about other groups. There are at least enough women here to point out blind spots and excluding language. Not so for lots of other groups. (I've felt like we do pretty well on neurodiversity issues, for similar reasons.)

Okay, I have seen this. Heh.
I have to agree with Morendil below. AFAIK there are only a few women contributing here, and at the same time there is definitely some concern about increasing this number; I remember that Eliezer wrote one post about this. You see, Alicorn, women are not the only group who is underrepresented, but I don't see the same concern regarding others. I see a pattern here as in the real world, "women need more fairness", and the end result is often that privileges are granted to them. Your post [http://lesswrong.com/lw/134/sayeth_the_girl/] certainly contributed to that impression. It's just an impression; maybe I'm seeing it totally through the wrong lenses.

Eliezer says (sorry for another example regarding PUA, but it's just so salient to me, and yes, unfortunately I feel the need to excuse myself; strange, isn't it? Why do I feel this need?): "In the end, PUA is not something we need to be talking about here, and if it's giving one entire gender the wrong vibes on this website, I say the hell with it." [http://lesswrong.com/lw/13j/of_exclusionary_speech_and_gender_politics/] What is wrong with this sentence? What about those here who maybe want to discuss this? Shouldn't they be entitled to it? So there is more concern about a gender that is underrepresented (people that are not even here) but that could hypothetically contribute more members in the future than for those who are active or passive (lurker) members in this community and have great interest in, and could possibly learn a lot from, this topic.

Alicorn, I read your comments on how to make friends and there are similarities to PU. And for those women/men who don't want to read about PU, why not ignore the respective articles?

Edit: sorry for making my argument so PU-centric. It's just something that was salient to me.
That's a very interesting point. I suspect it's because there's a lot of social pressure to assume that something is wrong if a group contains no or few women, while the lurkers don't have a political constituency. I think it would be worthwhile for LW to be a more comfortable place for women, but figuring out how to encourage people who are already interested and would be valuable contributors to post is important even though it doesn't have obvious signaling value.

All I can say is that people don't necessarily work like that. If they don't have a strong preference for a social group, they aren't going to ignore things they don't like. Also, a common reaction to PUA isn't "don't want", it's revulsion. There's a spread effect.
I think roland's point is that neither of these reactions is terribly appropriate for a community of aspiring rationalists. The conflict between this and the desire for broader appeal is really at the heart of the issue.

Personally I am an 'elitist prick', at least in the context of this site. I want to be able to freely discuss things that are not usually discussed because of revulsion reactions. Robin Hanson's fearless approach to this is what originally drew me to Overcoming Bias. I have plenty of real world friends and acquaintances to discuss safe topics with; the value to me of this site is the ability to discuss things that are not safe topics amongst normal people.
Have you noticed that the people who are remembered as making the most accurate and useful observations about PUA are the same people who didn't cause the disgust reaction? Also, remember that every post we promote is another possible first impression. I wouldn't want to join a community of people, mostly men, describing Women as an undifferentiated mass distinguished by an array of mental flaws to be exploited for personal gain - that's a sign of some hardcore irrationality, to make claims which are that self-aggrandizing and that easily refuted. (Easily refuted because they are hasty generalizations, I hastily add.)

Edit: I'll admit that I'm exaggerating the degree of bad rhetoric displayed here during the whole PUA flamewar, but the point about the hasty generalizations shouldn't be ignored - I know too many people who don't fit the stereotypes promoted in those discussions to view these stereotypes sympathetically.

I wouldn't want to join a community of people, mostly men, describing Women as an undifferentiated mass distinguished by an array of mental flaws to be exploited for personal gain - that's a sign of some hardcore irrationality, to make claims which are that self-aggrandizing and that easily refuted.

I wouldn't want to join a community that did those things, or which uncritically praised a community that did. Still, I think that even if the seduction community were an undifferentiated mass of irrationality, it would be worth discussing here for the same reasons that we talk about religion and astrology.

Personally, when I see people being successful in a certain domain (or believing that they are successful), yet holding some obviously irrational beliefs, my interest is piqued. If these people are successful, is that despite their irrational beliefs, or could it be because of those beliefs? Could it be that some of the beliefs of PUAs work even though they are not true?

I don't understand why other rationalists wouldn't be wondering the same things, even when confronted with the negative aspects of pickup. As I've argued in the past here and here, pickup relates to many rationality…

(Building on this earlier comment of mine. [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1y0h]) I appreciate your list of connections between PUA and rationality, because it's gotten me closer to working out why I don't see PUA as having a special connection to rationality. I think it's because I find the connections you suggest generic. Most of them, I reckon, would hold for any subculture with a sufficiently active truth-seeking element, such as (picking a few examples out of thin air, so they may not be good examples, but I hope they communicate my point) poker, art valuation, or trading card gaming. Though I'd guess that each of these topics has links to rationality like those you mention, in depth discussion of them on LW would tend to feel off-topic to me. This doesn't really relate to the more typical complaints about PUA that I see upthread - i.e. that some of the discussion of it grosses people out, and that it's inaccurately reductive - but I thought I'd add my two cents to convey my mental context for my last reply. [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1y0h]
Thanks for giving additional context. I think you are correct that we have a difference of opinion. Personally, I would be absolutely thrilled to see a discussion on LessWrong of how poker, art valuation, or trading card gaming relate to rationality. Would these subjects not interest you, or is your worry that discussion of them would get too far off-topic to a degree that is bad?

I suppose delving very deep into those subjects could also feel off-topic to me if the connection to rationality was lost, yet I would be comfortable with whatever level of depth people more knowledgeable than me on those subjects felt was necessary to elucidate the links to rationality. (And if other people were making truth-claims about the content of those disciplines, and those people often displayed bias or misunderstanding in either a laudatory or critical direction, I would be comfortable seeing those truth-claims evaluated. Even if debate about the merits or nature of a subject gets away from the direct relationship of that subject to rationality, that debate itself may demonstrate applications of rationality to a controversial subject, which I like to see.)

Your mileage may vary, but I find that I learn in a "hands on" way, and attempting to apply rationality to a practical problem helps me attain a more abstract understanding. See the notion of Contract to Expand [http://greenlightwiki.com/heuristic/Contract_To_Expand], where sometimes solving a specific sub-problem can be helpful for solving a larger, more general problem.

I would consider any subculture or discipline with a "sufficiently active truth-seeking element" to be excellent LessWrong fodder, as long as the discussion (a) was connected to rationality, or (b) addressed the nature of the subcultures and disciplines so that readers can know how they work well enough to evaluate their potential relationship to rationality (particularly if there is disagreement on that nature or relationship). Anyone else have feelings either way?

Would these subjects not interest you, or is your worry that discussion of them would get too far off-topic to a degree that is bad?

The second I think. (I feel about the same for topics in which I have shown interest, so it's not about my level of interest.)

If I wanted to force a conversation about a particular subculture or hot-button topic not obviously related to rationality, and I were called out on it, I could probably contrive a defensible list of ways my desired subject relates to rationality. For example, I took your list of bullet points for PUA and adapted most of them to race and IQ (a subject I'm more familiar with):

  • Instrumental rationality (IQ relates to indicators of life success, so one can argue about the degree to which IQ is a measure of instrumental rationality)
  • The availability heuristic (use of convenience sampling when testing psychological subjects; availability bias as a source of racial stereotypes about IQ)
  • Underdetermination of theory by evidence, and the problem of induction; how much ad hoc support we should allow a theory about race differences in IQ before we trash it
  • Self-fulfilling prophecies (stereotype threat and other situations where…)
Regarding the ratios of comment types, have you compared that at all to subthreads about other topics, possibly less controversial ones? Without some idea of the usual level for an equivalent LW conversation about a less controversial topic, it is very hard to evaluate this data.

I'm not sure, incidentally, that I agree with your breakdown of comments. For example, you include the comment that started off the conversation in none of the categories. Even just asking a worthwhile question should be worth something. And since this comment was at +17, even just by removing it we already substantially alter the average score of the 50 nones: the score goes from 2.7 to 2.4. This also illustrates another issue, which is that if even a single comment can cause that sort of change then it doesn't seem like this sort of data is statistically significant. Frankly, after realizing that, I'm not that inclined to check the rest of your data, since that adjustment alone already puts the two groups at the same 2.4 average.

The fact that it seems like this comment itself would be put into the none category, when I've made criticisms of the interpretation of evidence, suggests that your breakdown isn't great. (Please forgive the mild amount of self-reference.)
It would be interesting to see what the patterns would be like in other subthreads. I sampled only the one subthread because I was curious about variation among comments within the single subthread and not variation between subthreads, so I figured one subthread would be enough.

It's certainly not perfect! I would have liked to have used a finer and more sensitive breakdown, but it would have become difficult to apply. I tried to invent the simplest breakdown I could think of that wouldn't need much subjective judgment, and could approximate the types of discussion HughRistik had in mind.

That's true - my list of categories is conservative, so some well-regarded comments that didn't discuss data, predictions, or heuristics nonetheless didn't end up in a category. That said, although my category list wasn't exhaustive, I did still expect about as many comments to fit a category as there were comments that fitted none - I was genuinely surprised to get a 2/3 to 1/3 split.

Fair point. The distribution of comment scores in that subthread is very skewed, with a few outliers. If I drop the four high scorers on the far tail I can recalculate the averages for the 'nones' versus the non-'none' comments without the influence of those outliers. The 47 remaining nones' scores have mean 2.0 and the 23 remaining non-nones have a mean score of 1.8; the gap shrinks, but it's still there. If I did a statistical test of the difference, it likely would be statistically insignificant (and it'd likely have been insignificant even before dropping the outliers) - but that's OK, because I don't mean to generalize from that one subthread's comments to the population of all comments.

Yes - if I planned to apply the breakdown to other subthreads, I'd add a category for comments that criticize or discuss evidence mentioned by someone else. Fortunately, it shouldn't make much difference for the particular subthread I picked - I don't remember any of the comments making detailed criticisms of evidence.
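The outlier effect described upthread - a single +17 comment shifting the mean of 50 comments from 2.7 to 2.4 - is easy to check. A minimal sketch, using only the aggregate figures quoted above (the individual scores are not reproduced here):

```python
# Effect of removing a single +17 outlier from 50 comments averaging 2.7.
n, mean_all, outlier = 50, 2.7, 17

total = n * mean_all                        # implied total score (~135)
mean_without = (total - outlier) / (n - 1)  # ~118 / 49
print(round(mean_without, 1))  # → 2.4
```

That one value can move the mean by ~0.3 in a sample this small is exactly why the commenters above doubt the difference between the groups is statistically significant.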
Piling on to this excellent comment, I have a more specific interest in "how scientific is NLP".
That is indeed a good question that I don't know the answer to. Though it has been my impression that some of the ideas in NLP are parasitic on mainstream psychology. For example, "anchoring" seems related to classical conditioning.
I think it is a species of logical rudeness to judge an idea by its worst advocates. I'm sure any atheists who have been reminded that Hitler was an atheist can sympathize. Neil Strauss (author of The Game) recently made some good points [http://theobsidianfiles.wordpress.com/2010/04/27/a-word-from-neil-style-strauss-about-haters-of-the-game/] :
I think we're talking past each other. I'm not talking about judging the ideas, I'm talking about judging the worst advocates. Those people are the ones who cause the revulsion, and we as a community need to deny them the spotlight when they act up until they learn better. Otherwise the community comes off as not being a rationalist community, and aspiring rationalists who might be interested walk away. I don't even think we've been doing a bad job overall. But it's a job we're doing, not something that happens automatically.
And this is where differing perceptions are probably causing issues. I haven't seen any posts here from anyone who is anything nearing the worst advocates, but then I've hung around places where these topics are discussed much more confrontationally. I've seen nothing I deem worthy of censorship from the advocates, even the 'worst', but I have seen examples of what I view as completely unacceptable over-reaction, revulsion and guilt tripping from a small but vocal minority who claim offense.

I am very unsympathetic by nature to people who claim the right to block any conversation that they personally find offensive. My natural reaction to such people is to become more offensive, which, while it has some merit from a game-theoretic standpoint, is generally not conducive to social decorum, so I make an effort to restrain such impulses.

So for me, those people are the ones who cause revulsion, and we as a community need to deny them the spotlight when they act up until they learn better. Otherwise the community comes off as not being a rationalist community, and aspiring rationalists who might be interested walk away. So far people who share your perceptions seem to carry the support of the majority, but I think there is a significant minority that share my perceptions.
Despite some of the rhetoric flying around at the time, I don't think anyone involved made that sort of claim. It was rather more like "I find this sort of thing offensive" and "Maybe we should listen to him, since lots of people probably would be turned away by that sort of thing, and the offensive bits aren't really necessary." See Eliezer's contribution, Of Exclusionary Speech and Gender Politics [http://lesswrong.com/lw/13j/exclusion_vs_objectification/]. Nutshell: We should avoid doing things that make people feel excluded, and that includes being sensitive and not being all feministy. So basically we want both of the potentially-interested groups you've identified to stay. ETA: Surely I've overstated my case. Eliezer did suggest that he didn't think it would be a problem to ban PUA if it bothered people; the main idea is that PUA isn't that important of a topic in the grand scheme of things, so whatever.
It's not an either-or proposition, I think. I'll freely concede that I haven't been particularly sensitive to those sharing your revulsion for political correctness*, but it would be a mistake to offend either group to flatter the other. It's possible - it's even been done here - to hold these discussions in a way which is fair to both sides. It's just hard. Which is why it's usually a bad idea to go there. * I apologize if my terminology is incorrect.

It's just hard. Which is why it's usually a bad idea to go there.

Agree with first quoted sentence. Disagree with second one.

In my view, LessWrong should be a place where we rationally attempt to discuss subjects that would be too controversial to discuss anywhere else. On LessWrong, we can hold arguments in such discussions to higher standards of scrutiny than anywhere else.

I don't agree with the "it's hard, so we should give up" approach to discussing controversial subjects on LessWrong. Controversial, mind-killing subjects are exactly where rationalist scrutiny is most needed.

Here's a potential conflict in our views of LW's purpose. I think of it as being about discussing rationality, and things that touch directly on rationality and being rational. In that case discussing controversial, mind-killing subjects is only relevant inasmuch as they cast light on rationality - they're not inherently interesting. I've posted here before about race/IQ and global warming, and for both of those I've felt as if I was covering territory that's basically off-topic. This didn't stop me from posting about them, or make me feel bad about it, but I did feel that if I had picked arguments about those topics just because I could, that wouldn't have suited LW's purpose. I would avoid writing a top-level post about subjects like that unless I thought it was a good way to make a compelling, more general point about rationality - otherwise I'd likely just be axe-grinding.
To me, it seems obvious that there are a lot of links between pickup and rationality (both positive and negative). It's occurred to me that perhaps I've been over-estimating the obviousness of those links to others who don't have the same background in the subject matter, so I've tried to sketch out a bunch of them in my reply [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1yal] to RobinZ.
I'm down with a "one does not simply walk into PUA" attitude. I apologize for not saying so.
We may need a category of "this is too hard for us now", with the possibility left open that as more of us get better at rationality, more difficult topics can be addressed well.
Your terminology is fine. The asymmetry that disturbs me is that while 'political correctness' annoys the hell out of me, I'm not demanding that it be a banned topic of conversation to avoid offending my delicate sensibilities. I don't consider the causing of offense by particular views or topics to be a valid reason to avoid them. Note that this is different from discussing them in a deliberately offensive manner. I generally dislike an unnecessarily impolite or aggressive tone in discussions, but objecting to an entire topic is going too far in my opinion.
You're correct. My "usually" was an attempt to acknowledge this - in retrospect, not a competent one.
If we are still having this discussion could you link to a couple examples of the posts that you object to so much? I'm trying to figure out whether I missed something, or how similar my perceptions are to yours.
I can't point to any specific examples.
I haven't read the worst advocates. My negative reaction was based on reading material by average or possibly somewhat above average advocates. I wonder what the common reaction to feminism is here. It's got at least as wide a range as PUA.

I broadly agree with the feminist project and think they have done more good than harm. I also have the following criticisms:

  1. Feminists too often mistake the complex, dynamic and context-dependent way status/power actually works for an oversimplified "patriarchy" where men as a class oppress women as a class.

  2. This means feminism is much more sensitive to sexism against women and will routinely miss or play down sexism against men. This wouldn't be a problem except that feminism has sort of universalist aspirations; they're often more like a special interest group.

  3. Feminism sometimes advocates taking political roles that can be oppressive, in much the same way gender roles can. This is partly why the movement has had trouble embracing transgendered people, BDSM, porn stars, sex workers etc. (And why so-called 'radical' feminists still can't accept these groups.)

  4. Despite talking a lot about intersectionality, the core feminist institutions are more like a voice for Western, white, upper middle class women than for women as a whole. (A criticism I feel kind of like a dick making, as I am all of those things plus a man, but it's true.)

The thing is, there isn't a movement for gender equality. It seems to be very hard to motivate people to work on things without building in a group identity, and the group identity is us vs. them. Or have I spent too much time reading people who work that way, and there are alternatives I haven't seen? I've wondered whether people have a bias towards bad ideas. Simple good sense isn't dramatic enough (or possibly doesn't offer enough opportunities for power seeking) to get attention easily. Still, there's some good work being done, and I think of this as an effort to figure out how to live well with other people-- something which is surprisingly difficult.
Well, humans seem to be wired that way, so anyone you've met who works differently has done so deliberately and is very strange.
That sounds like 'radical feminism', and it's not so much a mistake of the 'feminism' as it is of the 'radical'. Marx did the same thing with class. Feminists actually have a lot of complex, dynamic, and context-dependent reasons for focusing on sexism against women, ranging from being a radical feminist, to thinking sexism against women is the bigger problem that needs to be dealt with, to thinking that's what 'feminism' is about by definition and someone else should have the job of being sensitive to sexism against men. It's one reason "women's studies" programs in universities have been slowly converting themselves over to "gender studies", to drop the female-centric nature as it's no longer needed. This has been internally considered a big problem for a long time, and now remains a problem only if you look at Western, white, upper middle class feminist institutions specifically. Indeed. There are a lot of crazies out there. But I'd make the same case for just about any 'movement'.
Well, my sense is that the simple view is (a) what the radicals hold and (b) what those who don't get into the theory end up believing. It's kind of like how the Catholic church itself doesn't think God is necessary for morality, but this view is common among evangelicals and unstudied Catholics. Being a radical feminist isn't really a reason for doing something; what are the reasons for being a radical feminist? Anyway, my understanding of the radfem position is that there is no such thing as sexism against men, so yes, they're not going to be paying a lot of attention to sexism against men.

These reasons I'm pretty much fine with (and mostly agree with), which is why the problem isn't that they aren't good at noticing sexism against men, but that they aren't good at noticing sexism, full stop, while taking themselves to be giving a universal and unbiased perspective on gender issues. Feminism has problems being both the major vehicle for gender egalitarianism and the major vehicle for empowering women. The contradictions here were extremely minimal when feminism started out, but of course the more success feminism has, the more this contradiction will come into play.

I agree that it has been a problem internally. And maybe I need to make this more clear: I basically have one foot in the camp and one foot outside it, so some (maybe even most) of my criticisms are things that feminists have said themselves. I'm not sure I know what you mean by "remains a problem only if you look at ...". I don't think there are many feminist institutions that identify themselves as Western, white and upper middle class. If you mean the institutions that are made up of mostly Western, white and upper class women, then I suppose I agree with you, except that these are the best funded, most influential and, for the rest of the culture, defining institutions for feminism. My experience reading non-white, poor and non-Western women on this subject suggests they still perceive many of the same problems.
Yes, I'd have to grant you that, and I think the rest follows. I get the impression it's moving in the opposite direction. The shrill radical sorts are being de-emphasized (not least since everyone noticed political correctness is silly), and as I noted "women's studies" is slowly transforming into "gender studies". The major battlegrounds now, as I see them, are on exactly these sorts of questions. Is gender egalitarianism possible? Is it valuable? Are there factors which explain things like income disparity, and what, if anything, should we do about them? But then, I haven't really been following the literature for a couple of years.
I see what you mean here. I think it's part of the same process. Equating gender egalitarianism with empowering women doesn't make quite as much sense as it once did, and for this reason radical feminists are losing influence; their message doesn't resonate like it used to. But at the same time, aspects of the radical view have been embedded in a lot of feminist 101 stuff (just think, for example, about the concept of the patriarchy), and mainstream/liberal feminism is having a really hard time getting away from that.
Sounds like we're on roughly the same page.
Personally my general reaction to feminism is negative but it appears to encompass a sufficiently diverse range of viewpoints that I find myself agreeing with some subset of those viewpoints. My impression is that rationality is not a strong feature of feminist thought but I recognize that I have probably been mostly exposed to the worst advocates. The most convincing advocate of feminist ideas I have encountered is Kerry Howley [http://en.wikipedia.org/wiki/Kerry_Howley]. I think I can probably stomach feminist ideas she espouses because they are sugar coated in a libertarian wrapper. I'm not even sure that she would self-describe as a feminist but I feel that what sympathy I have for feminist ideas can in large part be credited to her writing.
I like Kerry Howley too. She does self-describe as a feminist. She's in the tradition of Voltairine de Cleyre. I grew discouraged by feminism as represented by, say, the writers at feministe. There was a great deal of opposition to thinking the wrong thoughts. But you're right, it's an extraordinarily broad area, to the point of (almost) not being a useful term.
I think there is a parallel to the complaints about the PUA discussions here. I've often seen feminist ideas presented in a tone of hostility and misandry and embedded in a whole heap of background assumptions and beliefs that I do not share. I can read some of the same ideas from someone like Kerry Howley and appreciate that they are actually quite reasonable and compatible with my own views because I am not immediately on the defensive and looking for disagreement.
I also feel this way about criticisms of feminism. A lot of it comes from this entitled, resentful and misogynist place which aggravates me. I find that even among the most reasonable critics of feminism this attitude has a tendency to come out from time to time.
Is there anything in particular of Kerry Howley's that you recommend? This [http://charleswjohnson.name/essays/libertarian-feminism/] might be interesting-- it's an analysis of the similarities between feminist descriptions of the patriarchy and libertarian descriptions of the state, with the suggestion that libertarians and feminists could learn quite a bit from each other.
Here's a few on libertarianism/feminism: Libertarian Feminism versus Monarchist Anarchism [http://kerryhowley.com/2008/11/07/libertarian-feminism-versus-monarchist-anarchism/] Feminism and Libertarianism Again [http://kerryhowley.com/2008/11/10/feminism-and-libertarianism-again/] Does the Word Feminism Mean Anything [http://kerryhowley.com/2008/11/15/does-the-word-feminism-mean-anything/] And on reproductive/sexual issues: Notes on My Life Sentence of Buried Self-Negation [http://kerryhowley.com/2008/07/31/notes-on-my-life-sentence-of-buried-self-negation/] Trying Really Hard to Get Upset About Pornography [http://kerryhowley.com/2008/09/17/trying-really-hard-to-get-upset-about-pornography/] Might There Be a Connection Between Slut Shaming and Slut Jailing [http://kerryhowley.com/2008/03/11/might-there-be-a-connection-between-slut-shaming-and-slut-jailing/] The Myth of the Migrant [http://reason.com/archives/2007/12/26/the-myth-of-the-migrant]
Thanks for the link, it's an interesting article. I don't find much to take issue with there - I generally agree with their analysis. Unfortunately I see little evidence of any progress towards reconciliation. I find the focus on radicalism as a common trait interesting. I see parallels with coverage of the financial crisis where I basically agree with much of the analysis of people like Matt Taibbi [http://zerohedge.blogspot.com/2009/06/goldman-sachs-engineering-every-major.html] or Simon Johnson and James Kwak [http://13bankers.com/] on the root causes of the financial crisis but have a rather different idea of what needs to be done to fix the problem. The ideas of a feminist-libertarian alliance and a left-libertarian alliance have many commonalities.
Agreed. Still my point remains: to what extent should a group stop doing certain activities to accommodate hypothetical future members, who might or might not join even if the group ceases doing said activities?
A fair question, though it's worth noting that those particular activities were also annoying some current members.
Indeed - at the time, at least two of the site's "top contributors" were specifically put off by it.
I actually don't see why lurkers as lurkers should have a political constituency. They don't contribute to the site by definition. Any given lurker is welcome to become a poster and then they will be part of their own political constituency.
I wasn't saying that they should, just that they don't. Even so, it's possible that they should have a constituency of sorts if you want the site to grow.
If that is the case, then it certainly is not because of a general "Everyone is entitled to discuss whatever they want here" principle, as such a principle does not exist. The site's purpose is rationality, and anything that serves that goal is allowed, and anything that does not is suspect.
Did you even read the OP? He specifically mentioned the subject of mastering interpersonal relationships and I was answering to that.
The thing is, that's an odd response. Look at the flow here: Roland sez: should we really give up discussing PUA just to make women feel more comfortable? Kaiokan sez: I don't expect that many women on the site in the first place, because of XYZ, where XYZ is a fairly ambitious claim that's likely to be disputed in itself. Without XYZ, I think most of us, men and women, could agree on the basic point you're trying to make, that is, we expect more men than women on the site. So why bring up XYZ? It doesn't actually have a function in your argument other than the fact that you like it and you found an excuse to bring it up. (I'm often guilty of this too, but I suspect it's bad logical hygiene and I'm trying to get rid of those habits.) As for the actual question... well, it depends if we can trust ourselves to handle it well. Apparently the convention around here is that we don't bring up topics that totally overwhelm rationality, because we're trying to practice rationality. But ultimately we do live in a world where hot-button subjects exist and we have to respond to them one way or another, and potentially not just by avoidance. For somebody from my environment, and it seems for many others, the topic of PUA is a challenge. Maybe a fun challenge. But possibly one we can't handle well (yet).
You need to back up that particular non-obvious statement with the reasons which are currently convincing you of that statement [http://lesswrong.com/lw/ju/rationalization/], yes. Your response to research suggesting that the factors you cite are nonpermanent [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1y1a] would be appreciated.
Thank you for not wanting to sound intolerant. As a rule, though, if you don't know how to say what you want to say without disclaiming it, you have a lot of work to do before saying it.
That's not what I meant - your disclaimer was a warning from yourself that you should not be confident that you have avoided saying something intolerant. In such a case, simple editing is rarely the solution; if you don't understand the situation enough to be justly confident to start with, you don't understand it well enough to confidently make edits. In this case, you have repeated a number of strong claims about sex differences without acknowledging any of the evident cultural factors - it is these specific features of your comment which are likely to make you appear intolerant, and they are (ironically) more prominent after your edit.
I think you are being a little unfair. He stated two fairly well established psychological facts (greater variance in intelligence and differences on the empathizing–systemizing scale), a personality tendency with decent ev-psych support (lone-wolf) and a reasonable extrapolated hypothesis from these tendencies (male dominance of computing/math/engineering disciplines). He then made a clearly flagged personal prediction based on these observations that we are unlikely to ever see a high percentage of female commenters here given the subject matter. Any interpretation of a nature/nurture assumption is coming from you. He merely noted the differences and did not express an opinion on the reason for them. We can do better than the Larry Summers Harvard debacle here. Address the evidence for the claims or the specific reasons why a different tone would be preferred rather than engaging in pre-emptive censorship.
Apologies for the delay - this is not a field in which I have particular knowledge, so it took me some time to track down an appropriate reference (h/t Jezebel blogger Anna North [http://jezebel.com/5275547/sorry-larry-summers-math-gender-gap-caused-by-culture-not-biology] ): Janet S. Hyde and Janet E. Mertz, "Gender, culture, and mathematics performance", PNAS, vol. 106, no. 22, June 2, 2009, pp. 8801-8807, doi: 10.1073/pnas.0901265106 [http://www.pnas.org/content/106/22/8801.abstract].
Thanks, this is the sort of specific evidence I was hoping for. I'll take a look.
Intelligence is a vague, multi-faceted word. Whenever intelligence is mentioned in a comparative or quantitative way, care should be taken to indicate exactly which dimension of intelligence is being measured. Since the dimensions of intelligence probably aren't well parametrized, it would be sufficient to indicate the particular test that was being used. Otherwise, the biases that sneak in are less traceable. Experimental science has established a really good norm here: always include the detailed context and methodology of the experiment, so that later researchers can estimate and predict biases and figure out 'what went wrong' when they get a different result under different conditions. For example, if it was a standard IQ test that determined the variance in male intelligence, I have an understanding of the biases in those tests, and if it was comparing income, I have an understanding of the biases there. When it comes to experimental studies in social science and psychology, I always weight their results low compared to a lifetime of my own observations, because if I've observed anything, I've observed that things are complex, and I know we haven't developed the tools to handle this complexity.
Yes, the world is a complex place, and yes, any finding in the social sciences may fail to show what it purports to show due to biases and flaws in the methodology. But we can do better here than simply ignoring all evidence on the basis that it might be wrong. Remember that 'belief' in some idea is not a binary thing: 0 and 1 are not probabilities, and all beliefs are open to future revision in either direction in light of new evidence. A rationalist should be trying to refine their degree of belief by asking questions and doing further research.

Greater variance in male performance is both a widely observed phenomenon in many domains and something you would expect to see given the differing selection pressures on males and females. It need not be an emotionally laden observation, since it does not inherently imply that either gender is 'better' than the other in some way; it is merely an observed regularity of our world. So if you dispute the evidence for greater variance in male performance generally, and in intelligence measures specifically, please address your criticisms to specifics. What specifically are the biases in standard IQ tests or measures of income that you have an understanding of, and how do they act to produce misleading results? What other data (experimental is preferable, but anecdotal is admissible for consideration) do you have to offer on this issue? This is a perfect example of a question we can collectively apply our rationality to in order to improve the accuracy of our probability estimates.

Or don't. Just say 'I don't believe any of this evidence should influence my beliefs because the world is complex and evidence can be wrong' if you choose. But do not pretend that that is either a noble or a rational stance to take on an issue.
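The "degrees of belief" point can be made concrete with a toy Bayes'-rule calculation. This is only a sketch; all the numbers (the prior, the likelihoods, the number of studies) are invented for illustration, not drawn from any actual evidence:

```python
# Toy illustration that belief comes in degrees: a prior probability is
# nudged by each piece of evidence via Bayes' rule, and never jumps to
# 0 or 1. All numbers here are made up for the example.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) from the prior P(H) and the two likelihoods."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

belief = 0.5  # start undecided about the hypothesis
# Treat each (hypothetical) study as evidence twice as likely under H
# as under not-H.
for _ in range(3):
    belief = update(belief, p_e_given_h=0.8, p_e_given_not_h=0.4)

print(f"belief after three such studies: {belief:.3f}")
```

Each study moves the estimate closer to 1 without ever reaching it, which is the sense in which a rationalist "refines" a degree of belief rather than flipping a binary switch.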

I agree that we can do better.

I have a thought on these studies that give evidence for unequal intelligence between the sexes (or races). They can have very scary, emotional connotations. They used to scare me. Then I thought about it a bit and asked "What am I scared of?" And I realized that I was scared that, if these genetic inequalities were real, I'd have to be a sexist or racist.

But think for a minute. Suppose the "worst-case scenario" were true. Suppose women really did have worse brains than men, for genetic reasons. What would be my logical response?

It occurred to me that the only responsible way to react to such news would be to treat it as a disease to be cured. And then start working on biology to fix it. I am not an anti-Semite because I'm aware that Tay-Sachs disease affects Ashkenazic Jews.

If there were genetic differences between sexes or races, I'd be less likely to favor affirmative action at the college or employment level, because it wouldn't be effective. The injustice would be biological, not social, and it would be best fixed biologically.

The real reason people are scared of genetic differences is the naturalistic fallacy.

With the caveat that a biological injustice and a social injustice are not mutually exclusive - there may be genetic differences between sexes/races, but that would not eliminate the possibility of additional unnecessary social barriers to college or work. ETA: I should also have remembered that 'biological' does not equal 'genetic.'
Surely the injustice here, if any, is that different individuals are differently intelligent, not that the average varies across groups?
Mostly, yes. Though there's an added harm if the average varies across groups, especially groups where membership is easy to recognize. Because then, people (reasonably) make generalizations, and high-IQ members of a low-IQ group are harmed by negative opinions. Add in the fact that people have biases, and stereotyping is likely to go even beyond what's reasonable, so the problem becomes even worse. But yes, if some individuals have low IQ for genetic reasons (or other reasons beyond their control) I consider it a bad thing and we ought to see about fixing it. I think Eliezer made this point earlier.
The injustice is that each individual is not maximally intelligent. The variance in intelligence between individuals just means that this is more unjust for some people than others.
I agree that's the main bad thing, but I'm not sure it would be properly called an "injustice", and I have strong reservations about the "maximally".
I'm not sure whether I'd want to be maximally intelligent. You could say the injustice is that individuals are less intelligent than would be optimal for their flourishing, or whatever.
One of the major concerns here is Gattaca-type scenarios, where, when you're looking for very smart people, you throw out all of the applications from females to maximize your odds of getting a very smart person. Obviously someone with the time to look at every application wouldn't do that, as the smartest applicant could still be a female. But usually there are some factors that you use first to throw away part of the stack so you don't have to look at them all.
That may be happening already. (Statistical discrimination is one model for employment discrimination, and as I recall it doesn't hold up too badly; better than the Becker model, at least.) It's not an intrinsically "Gattaca" idea. Of course, in a world where there was a biological "fix" for low IQ, you'd have the issue of whether it should be voluntary (I'd say yes) and whether people who don't opt for it should get preference from schools and employers (I'd say no, but tentatively) and what to do about access (it's complicated.) But I'm fairly confident that if IQ matters for real life outcomes, then a world where it can be improved technologically is a better one.
Yeah, I think this is a common reaction and it's not an entirely unreasonable reaction because sexism and racism are bad. But as you've realized it is better to know the truth if you want to influence the world in a direction consistent with your values. Just to be clear though, the claim is not that men are 'more intelligent' than women. It is that men have greater variance. This means more geniuses and more morons. It only carries value connotations if you believe more variance is inherently 'better', not if you believe higher intelligence is better. If you look at the scandal over Larry Summers' comments on this issue you will see that the vast majority of people who were offended by his comments did not understand this distinction. Yes, this is the key. It is always better to know the truth if you wish to effectively influence the future. You still get to choose your own values though - if the way things are is not the way you think they should be then believing true things is your best bet for effectively resolving that discrepancy in a favourable way.
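The variance-versus-mean distinction is easy to check numerically. Here's a minimal Python sketch using the standard library's `NormalDist`; the two distributions and all the thresholds are invented for illustration (same mean, different standard deviations), not taken from any study:

```python
from statistics import NormalDist

# Two hypothetical groups with the SAME mean but different spread.
same_mean = 100.0
wider = NormalDist(mu=same_mean, sigma=15.0)     # higher-variance group
narrower = NormalDist(mu=same_mean, sigma=13.0)  # lower-variance group

def tail_shares(dist, low=70.0, high=130.0):
    """Fraction of the distribution below `low` and above `high`."""
    below = dist.cdf(low)
    above = 1.0 - dist.cdf(high)
    return below, above

w_below, w_above = tail_shares(wider)
n_below, n_above = tail_shares(narrower)

# The wider distribution has more mass in BOTH tails -- "more geniuses
# and more morons" -- even though the averages are identical.
assert w_above > n_above and w_below > n_below
print(f"wider:    {w_below:.4f} below 70, {w_above:.4f} above 130")
print(f"narrower: {n_below:.4f} below 70, {n_above:.4f} above 130")
```

The wider distribution "wins" at both extremes even though neither average is higher, which is exactly the distinction most of the coverage of the Summers remarks missed.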
Yeah, I'm aware about the variance thing. The "geniuses" side of the graph stands out to me more, but probably only because of personal relevance. (I never took an IQ test but I'd guess I'm more likely top half than bottom half.) But you're right, if there were higher variance among men, and if IQ mattered, then I'd believe we were also obligated to do something about low-IQ males. Richard Whitmire (see http://blogs.edweek.org/edweek/whyboysfail/ [http://blogs.edweek.org/edweek/whyboysfail/]) convinced me that there's a serious educational problem among US and European boys. He examines the education system, not IQ, but if some of it turns out to be biological then we should be working on that too.
I think the problems with the education system in the US and Europe are more fundamental than just that it is failing boys. The real problem is that we have an educational system adapted for early 20th century industrial societies that for institutional reasons has been unable to adapt to a completely transformed economic and social landscape. The most obvious victims are certain groups of boys but the whole system is completely unsuited to the modern world and is structured in a way that makes it largely incapable of correctly diagnosing or doing anything to fix the problems.
You and everybody else [http://en.wikipedia.org/wiki/Lake_Wobegon#The_Lake_Wobegon_effect]
Yes, but usually, when a grad student in mathematics [http://lesswrong.com/lw/26l/what_are_our_domains_of_expertise_a_marketplace/1xtf] believes she has above median intelligence with respect to the general population, she will turn out to be right.
The question isn't necessarily whether men in general are better than women in general. The active question seems to be whether it's alright for women to get high status positions.
The active question in what context? Clearly the variance issue has no bearing on this question. Whether greater variance in intelligence is a real phenomenon potentially has bearing on questions regarding whether institutional sexism in certain academic disciplines is a real issue and on the appropriateness of quota systems or 'positive discrimination' but I don't see how anyone who understood the issue could claim that it was not 'alright' for women to hold high status positions. I'm sure some people make that unrelated claim but people believe all sorts of crazy shit.
You read way too much into my comment! In your misreading, the issue seems to be that you expect I feel emotional about gender issues, and perhaps feel threatened indiscriminately by any statements about gender and intelligence. But I don't. It happens that I'm not generally concerned about gender issues and am not on the lookout to defend them. Personally, I don't anticipate feeling threatened by any statements that might be true about gender.

On issues on which I am emotional (they exist), I feel much less threatened by statements if they are either clearly personal opinions/impressions or scientific statements that carry the specifics with them. I feel that if the specifics are there, I can trust that there is enough information to vet the statement, if needed, in the present or the future, and thus prevent inappropriate application. If the statement is a personal opinion/impression, we know the appropriate application of that. So if the point is to discuss issues "rationally", without stirring up emotions, it would be a good norm to always signal clearly whether a statement is a personal opinion/impression OR a scientific claim. If the latter, it is the careful inclusion of the methodology/context that makes it scientific.
I apologize for jumping to conclusions. This is sort of why I think getting into specifics is important. If you just make a vague hand-wavey 'this might not be true' dismissal of a claim you leave your interlocutor with little choice but to try and guess what your true objection is and so read too much into your comment.
This isn't what I did. My criticism was fairly focused, with a fairly specific solution: The part that had you thinking I was dismissing the claim was probably this: It probably would have been wise to omit this sentence, since it caused so much bias about my intentions. My idea is that researchers do try to tackle complex subjects, like intelligence, and will measure something or do some experiment and report the results, but the interpretation or relevance of the study is all 'spin' in the Abstract or heavily dependent upon the reader's lifetime experience to understand the relevance. For example, what is "intelligence"? This is something that a group of researchers have to define, and have to measure in some way, in order to do their study and get it published. Consider the Deary study [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1y1b]. They've measured something and called it general intelligence. This part is the spin. However, when you look at how they defined [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1y1r] "general intelligence" -- this is a scientific paper; they do tell us, and they're specific -- it is patent that they didn't include social intelligence, emotional intelligence or "street smarts" in this conception of intelligence. Requiring this clarification isn't dismissing the study results, it's just emphasizing that the context and the specifics are important.
See the Deary study [http://www.psychologicalscience.org/journals/pps/3_6_inpress/johnson.pdf] of practically the entire population of Scottish 11 year-olds, which found greater male variability. The introduction of the study also discusses the history of the greater male variability hypothesis, and some of the evidence for it. There is a cross-cultural study which found that males have higher variance in most populations, but females do in others. (Of course, this doesn't necessarily mean that the difference is "cultural," though it could.) I will try to dig it up. Even so, greater male variability is a robust finding.
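Just to separate the statistical point from the empirical one: with identical means, even a modest difference in standard deviation produces increasingly lopsided ratios the further out into the tail you look, which is why "greater variability" matters for discussions of extreme outcomes even if average ability is equal. A quick back-of-the-envelope sketch (the standard deviations here are hypothetical illustration numbers, not figures from the Deary or Feingold studies):

```python
import math

def normal_tail(threshold, sd):
    """P(X > threshold) for X ~ Normal(mean=0, sd), via the complementary error function."""
    return 0.5 * math.erfc(threshold / (sd * math.sqrt(2)))

# Two hypothetical populations with identical means but sd 1.0 vs 1.1.
# The ratio of tail membership grows as the threshold moves outward.
for k in (1, 2, 3):
    ratio = normal_tail(k, 1.1) / normal_tail(k, 1.0)
    print(f"above {k} sd: higher-variance group is {ratio:.2f}x as common")
```

The same arithmetic works symmetrically at the low tail, which is the often-overlooked half of the variability hypothesis.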
This is the bit that I think is important when discussing results about intelligence: However, I'm not saying you need to include this information in your comment because you already made the context specific: the Deary study. So a person can dig deeper and find these details if they want to. Just to say, you didn't actually support this. Unless it is supported in the Deary study?
It's supported at least by the combination of the Deary study, and the cross-cultural study I mentioned that I'll have to look up when I get home. I believe the author was Feingold. Good question, though.
Oh, I see I parsed your sentence wrong anyway. I thought there were some unidentified number of studies that showed women had greater variability.
My bad... The Feingold study is a meta-analysis of studies, some that find greater male variability, and some that find greater female variability in various dimensions.
Huh. Well, does this control for age? The population should be around age 20, when both genders are at peak mental capacity.
This does not seem like an apt reply to the above. This: Is a particularly bad straw man. "I weight their result low" seems to be doing the sort of thing you advocate, as opposed to "no influence". Do you disagree with the general stance presented, that the methodology/context of an experimental study matters, and these things should be taken into consideration when evaluating their effects on our beliefs?
No, and I would welcome discussion of the specific issues that people are taking into account, as I said. I am open to the possibility that there is some major flaw that renders these results questionable that I have not previously encountered. What I am objecting to is precisely the lack of such specifics. It is not enough to say that things are complex and studies can be flawed or misleading. These are trivially true facts and to imply that your interlocutor is unaware of them on Less Wrong of all places is disingenuous. There have been a few comments in this thread that are taking the same kind of approach - dismissing the claims on the basis of unspecified flaws in the supporting evidence but never offering specific rebuttals to any of the disputed claims.
Don't worry, I also have a terrible style. Just continue practicing, it takes a lot of time to become a good writer. Eliezer has written some posts touching on this subject.
I'm not sure how this would come off as intolerant, though my detector for that stuff has been off the last couple of days. Anyway, whether or not you're right pretty much depends on what you mean by "substantial". Off the top of my head I can think of five or six female regulars. That isn't too shabby. There aren't that many regulars, period. Considering this is (a) the internet, and (b) a place where a bunch of computer nerds talk about science and philosophy I think that's actually quite good! It would be nice if the ratio was a little better, but I don't think anyone here actually thinks it would ever get past 1:4. And I think we're okay with that. As long as it doesn't become like 1:100.
I wonder if we should focus so much on the gender inequality. Nowadays everything seems to operate under the assumption that gender equality in numbers is a desideratum. I don't know if we should operate under this assumption unless we want to signal that we are conforming to the Zeitgeist. If the site's purpose is rationality should it matter if there is a majority of males? I agree of course that females should be welcome and treated with respect, but the same applies to anybody. Midgets should also be welcome and treated with respect as should people who were born in airplanes over the Atlantic. And don't forget the people with green eyes and black hair, they too deserve respect.

I agree of course that females should be welcome and treated with respect,

This was the issue. The way PUA was being discussed made some women here feel unwelcome and disrespected.

If the site's purpose is rationality should it matter if there is a majority of males?

Of course not. No one expects there to ever be anything but a majority of males. But the community would be better off if the ratio wasn't as skewed as it is. Some reasons:

  1. Gender diversity means experience diversity and neuro-diversity, these things let us catch blind spots. The fact that we are men means there will be experiences we aren't aware of and it is helpful to have people with those experiences around to fill in the gaps. This of course goes for all kinds of socially significant diversity.

  2. Women, on average, appear to be less confrontational and aggressive in their discussions here (I don't know if this is learned or innate). People with such demeanors are good to have around as the rest of us appear to get our egos caught up in arguments a lot.

  3. One ostensible goal of this site is to help spread rationality. Alienating large segments of the potential convert pool is a bad idea.

  4. The general consen

... (read more)
Huh? It's not like anyone is advocating affirmative action or extra karma for women. Some of the women here (in addition to some of the men) objected to the exclusionary, objectifying language, overly broad generalizations and ethically ambiguous advice that went hand in hand with the way some people were discussing Pick-Up Artistry. We want people with good ideas, no? Then if possible, let's avoid alienating groups of people who may have good ideas, and perhaps just as importantly, different ideas. Diversity is how you catch blind spots.
Sometimes movements change after they get founded. Arguing from founders is like arguing that a word's current meaning is the same as its derivation.
Why are you starting shit? We've had rather pleasant and informative discussions on PUA and PUA-related topics this month.
I think this comment is at like +5 -8, unless someone is just changing their vote a lot. I've been trying to avoid asking about the occasional downvote, but this keeps fluctuating so I have to ask. I'm confused about what people dislike so much. I can't imagine the downvotes are just about using profanity. I only commented because I didn't understand why we were diving into an inflammatory meta-conversation, complete with accusations of gender inequity, when the object-level conversations had been going fine.
I would guess that the profanity is enough to explain the low score. Aside from that, "Why are you starting shit" seems at least rude and also carries a host of connotations including a belligerent nature.
Jack, it wasn't my intention to start "shit" (using your words). I'm writing on PUA based on what I remember reading about it in the past on this website, and I've been a member here since the days when this was still overcomingbias.com. I wasn't aware of the current discussion, but even so I don't think it invalidates my point if you bother to read what I wrote.
Evidence for any of these claims would be swell.
See my other replies in this thread.
Please don't reopen the PUA argument. I think you may overestimate the difficulty. There clearly was some set of near-evolutionarily-feasible priors combined with a pattern of observations which transported any hypothetical dissenter into a particular belief - it strikes me as implausible to assume that such a person could not eventually convince many of us if that person were right. Given how many theists are regular contributors, I'm not yet convinced that this is a problem we need to be working on.
I think the example roland has in mind is the fact that he hasn't been able to convince us 9/11 was an inside job.
Honestly, I'd rather not reopen that one either - but mostly because I find it incredibly frustrating, rather than because it is mind-killing.
Oops... :-)
The importance of saying oops [http://lesswrong.com/lw/i9/the_importance_of_saying_oops/]
At least you get credit for perspicacity. (-;
The PUA argument is closed? I didn't get the memo. What was the final decision?
There wasn't one. See my reply to roland [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1xmp].
It's not my intention. But at the same time I wonder why you would be so opposed to it? That's exactly the kind of problem we are discussing here: not following arguments that are disapproved of by the majority for some reason. In another comment of yours below you don't want to reopen the 9/11 thread either. Since there have been several top posts on this topic, all defending the orthodox viewpoint, I think it would be more than fair to grant a chance for the dissenters. But don't worry, I'm not planning to do this, for now. The key here is eventually. Semmelweis [http://en.wikipedia.org/wiki/Ignaz_Semmelweis] proved that handwashing could diminish infections in clinics, yet it took over 20 years (and countless unnecessary deaths) for such a simple idea that could be easily tested to be finally accepted.

But at the same time I wonder why you would be so opposed to it?

Because politics is the mind-killer and almost every single conversation about pickup artistry immediately becomes infested with politically-charged claims. It's like that discussion about the correlation between race and intelligence that went to hell in a handbasket not long ago. If there be mines, don't go for a walk.

I actually feel like the last time it came up the discussion was really constructive: what began as a near flame war ended up as a friendly and informative discussion (see, in particular, HughRistik's comments). In this case the initiator was pretty clearly either biased or unable to communicate his reasons. He also used unsavory tactics. But I thought the discussion outside that particular poster was quite good: fair-minded and rational.
As I said to NancyLebovitz [http://lesswrong.com/lw/218/what_is_missing_from_rationality/1xn0]: 1. These few insightful posts have been the exception - most comments on PUA here have been much more incendiary. 2. roland hasn't shown any sign of being noninflammatory on the subject, much less insightful.
I think that depends on how you count. Most times PUA has been brought up, it has gone quietly, but the threads that have gone badly have generated a lot of comments.
I concede that I have performed no analysis of the distribution.
I don't remember ever writing much on the subject of PU, except the meta-comments in this post. Prove me wrong please?
I didn't say you had. But given that you've managed to inflame tempers with nothing but your meta-comments in this post, I think "inflammatory" is justified.
I thought that some progress actually got made. I got a better idea of the more benevolent end of the range of PUA, and pjeby and someone else stopped generalizing so much about women.
The "almost" in the "almost every", and I was impressed when I saw it. I do not believe I exaggerate when I claim that ten times as many comments failed where pjeby's succeeded. roland has not demonstrated the same kind of awareness - somewhat the opposite.
And it wasn't just a formal "almost every"-- there was a description of a sort of woman who'd been left out of the discussion. I'm willing to bet that his theory of typical and atypical women is still incomplete, but at least it includes a lot more of my experience. And I forgot to mention that I got a better understanding of a lot of the men who go in for PUA. OK-- there's that almost, but sooner or later, we have to work on being rational about difficult things. As I recall, what went wrong with the race and intelligence discussion was someone who kept making assertions with no evidence. I wouldn't be surprised if that person didn't know what evidence was. It might have been a moderation problem. Banning people for utter cluelessness might have been the only solution. On the other hand, I don't think anyone tried to engage that person in a discussion of how they thought about evidence.

A good example is negotiation skills - I can't (offhand) recall a post discussing those directly.

Negotiation is generally regarded as one of the "soft" skills, and so often disregarded by thinkers of a more analytical stripe - yet we live in a world where negotiating with others who may not be as rational as you are can be a very fruitful way of advancing your personal goals.

Schelling's work is very directly concerned with both explicit and implicit negotiation. See http://lesswrong.com/lw/14a/thomas_c_schellings_strategy_of_conflict/ and http://lesswrong.com/lw/24o/eight_short_studies_on_excuses/ and many comments.

When society acts, it tends to benefit most when it acts in what I would call the Planning model of winning, where reward is a function of the accuracy of beliefs and the efficacy of explicitly reasoned plans

I'm not sure I agree with this. In fact I'm not quite sure what it means altogether. (What would I believe if I did in fact disagree with it?)

Could you try and clarify the contrast you're drawing here?

That society doesn't get rewarded according to "the accuracy of beliefs and the efficacy of explicitly reasoned plans". For example, if there were a God that rewarded (with e.g. prosperity, security and ease of living for all) all and only societies that were compassionate towards their lowest and worst-off members, then the statement would be false. The reason is that in that case the rewards came without an explicitly reasoned-out plan to get them, but nevertheless they are tied to what the society does.
I'm not sure what it means for society to be rewarded, or for society to benefit: I don't think of "society" as a reward-seeking agent. Societies persist, change, sometimes disappear. There is a class of beliefs and plans which have had large effects on societies - influencing their persistence, enacting large changes - and that is the class of scientific and technological beliefs. Perhaps it would be more useful to say that scientific and technological beliefs have large effects on how societies fare, but smaller effects on how individuals fare. I'm not sure how true that is, but it sounds more testable.
I think that the point is that the creators of LW/OB implicitly take that stance, that there is such a thing as a "better" outcome for the whole human race or the whole country, and that we ought to have better institutions to achieve these outcomes. And if you do take that stance, you end up with planning-esque rationality, because "scientific and technological beliefs have large effects on how societies fare, but smaller effects on how individuals fare"
Thanks for this comment, Morendil, your rephrase makes the point very clearly:

Spending lots of time thinking about concepts like cryonics, the Great Filter, the self-indication assumption, omega, etc. does not lead directly to traditionally desirable life outcomes.

If we wanted to be more traditionally successful, we would have more posts on what could be termed "quotidian" rationality, topics like investing, career planning, fitness, fashion, relationships and so on. But there are many other sites/magazines/books about that stuff; it's unclear how the rationalist viewpoint could help figure out a better (for example) diet... (read more)

I tentatively disagree. I'm actually working on a post about this very issue, with examples of the type you cite.
Well, it's a bit clearer if you remember that people are crazy and the world is mad [http://lesswrong.com/lw/17c/outlawing_anthropics_an_updateless_dilemma/13g0]. If everyone else is basing their diets on, say, the flow of moon spirits through their chakras, then I think rationality has something to offer. Imagine a nutritionist. Now imagine they know how to form accurate beliefs, unlike most people. See the improvement?
Sure, but nutrition claims to be a science, and they don't break obvious rules of rationality. It's not like they're developing diets based on the motions of the planets. Now, I don't have any confidence in any of their conclusions, but to do better would require more than mere philosophical sophistication; one would have to go out and gather actual data.
For nutrition in particular, I actually think epistemic techniques would be useful. The whole diet/exercise/weight loss cluster is a bit Wild West. I've read commercial gurus (who tend to be unscientific) and peer-reviewed studies (which tend to show a lack of practical knowledge, typically in that the "test" diet or exercise is often nowhere near as intense as what actual fitness buffs do.) Being aware of cognitive biases and having some crackpot-detecting mechanisms would actually be useful. Incidentally, since I realized that it can be hard to find suitable non-political examples for use here, nutrition might be a good substitute for climate change in examining how to look at "scientific consensus."
There's nothing wrong with basing your rationality on actual data, and I'd say it's a useful practice. As LW gets larger, we may want a split between general theory of rationality, longterm speculation, and practical application, just to give people more tools for finding what they're interested in.
I don't agree with your assessment. That is to say, I accept the 'science' part but not the 'rationality' part. Nutrition is based on politics, with the rational-rule breaking that politics entails.
There is quite a lot of evidence [http://www.amazon.com/gp/product/1400033462?ie=UTF8&tag=mattnewportco-20&linkCode=as2&camp=1789&creative=390957&creativeASIN=1400033462] that they have been rather bad at updating based on the data that has been collected.
(Pointless nit-picking: "Dietitian" is the protected term internationally - "nutritionist" isn't, in the U.S. or the U.K. Anyone can call themself a nutritionist [http://www.youtube.com/watch?v=VIaV8swc-fo].)

I definitely agree with this. A lot of my criticisms of the general kinds of things that get discussed around here (which I used to voice more often on OB) disappeared when I saw the sorts of problems they were being applied to. The "planning model of rationality" works remarkably well when applied to problems where you get to plan. My initial criticism of Bayesian methods for making decisions under uncertainty was that it doesn't work very well for most of our decisions, things like "Which path should I take across the room to retrieve m... (read more)

I think discussion of talent is generally lacking from rationality. Some clearly very irrational people are extremely successful. Sometimes it is due to luck, but even then it is usually the case that a large amount of talent was necessary to enter the lottery. With my particular combination of talents, no amount of learning the arts of rationality is going to turn me into a golfer like Tiger Woods or a media mogul like Rupert Murdoch.

The closest Roko's list comes to this sort of thing is microeconomics, which includes comparative advantage. Taking ... (read more)

Rational implementation is what we need more of. I wouldn't say planning is the "wrong kind of thought process". I'd say we have an abundance of planning tactics and a shortage of implementation tactics. Once you decide how to deal with your in-laws, how do you stay cool enough to actually do it?

The numerous posts on Akrasia are a big step in the implementation direction, though. We could use more vivid classifications of implementation problems like that, and techniques to deal with them.

"In a sufficiently mad world, being sane is actually a disadvantage"

To be sane within the usual limitations of a single person is not enough. But a much saner civilization facing a mad civilization has a big advantage in itself. Guess who will likely win in a clash!

A sufficiently sane transhuman could deal with a mad civilization. The power lies in the sanity accumulated - or better, the rationality accumulated.

Typo in the second paragraph -- cannon / canon.

The natural evolution of the 'rationalist dojo' martial arts metaphor. What rational agent would limit itself to the hand to hand combat it begins with when it realises the potential for long range siege weaponry?
Steve Perry

When reading the title my response was "Nothing, but there are all sorts of potential problems in the stuff that you are implicitly adding to it". The use of the term here is as a symbol representing a bunch of cultural mores and attitudes that are distinct from what is contained in a definition of the word. If you must use 'rationality' to describe the problems you mention here then at least give it a capital 'R'. Much as, of the two most significant political parties in Australia, the more conservative is the 'Liberal' party, and 'Freedom Fighters' do all sorts of things not necessarily optimised for furthering freedom.
