Cult impressions of Less Wrong/Singularity Institute

LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.


There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

Sacrificing or devoting? Those are different things. If FAI succeeds, they will have a lot more life to party with than they would have otherwise, so devoting your life to FAI development might be a good bet even from a purely selfish standpoint.

XiXiDu: I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk? It seems like nobody is doing anything they wouldn't be doing anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks; they are just the kind of people who enjoy doing what they do. Are there people who'd rather play games all day but sacrifice their lives to solve Friendly AI?
Wei_Dai: You mean when he saw himself in the mirror? :) Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong, but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?


by John_Maxwell · 1 min read · 15th Mar 2012 · 248 comments


If you type "less wrong c" or "singularity institute c" into Google, you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency. (EDIT: Please avoid testing this out, so Google doesn't autocomplete your search and reinforce their positions. This kind of problem can be hard to get rid of. Click these instead: less wrong cult, singularity institute cult.)
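The direct-link trick above works because a Google results page can be reached by encoding the query into the URL, rather than typing it into the search box where it would feed the autocomplete. A minimal sketch (the URL format is the standard `google.com/search?q=` pattern, not anything specific to this post):

```python
from urllib.parse import urlencode

def google_search_url(query: str) -> str:
    """Build a direct Google results URL for a query, so it can be
    opened without typing into the search box (and thus without
    reinforcing the autocomplete suggestion)."""
    return "https://www.google.com/search?" + urlencode({"q": query})

print(google_search_url("less wrong cult"))
# https://www.google.com/search?q=less+wrong+cult
```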
There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.

I have several questions related to this:

  • Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
  • If so, can you suggest any easy steps we could take to avoid giving that impression?
  • Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
  • Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
  • Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.

If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:

Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Here are the worst violators I see on that about page:

Some people consider the Sequences the most important work they have ever read.

Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.

Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]

And on the sequences page:

If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.

This seems obviously false to me.

These may not seem like cultish statements to you, but keep in mind that you are one of the people who decided to stick around; the typical mind fallacy may be at work. Clearly some population thinks Less Wrong seems cultish, as Google's autocomplete shows, and these look like good candidates for the things that give them that impression.

We can fix this stuff easily, since both are wiki pages, but I thought they were examples worth discussing.

In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.