If you type "less wrong c" or "singularity institute c" into Google, you'll find that people are searching for "less wrong cult" and "singularity institute cult" with some frequency. (EDIT: Please avoid testing this out, so Google doesn't autocomplete your search and reinforce their positions. This kind of problem can be hard to get rid of. Click these instead: less wrong cult, singularity institute cult.)
There doesn't seem to be anyone arguing seriously that Less Wrong is a cult, but we do give some newcomers that impression.

I have several questions related to this:

  • Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
  • If so, can you suggest any easy steps we could take to avoid giving that impression?
  • Is it possible that there are aspects of the atmosphere here that are driving away intelligent, rationally inclined people who might otherwise be interested in Less Wrong?
  • Do you know anyone who might fall into this category, i.e. someone who was exposed to Less Wrong but failed to become an enthusiast, potentially due to atmosphere issues?
  • Is it possible that our culture might be different if these folks were hanging around and contributing? Presumably they are disproportionately represented among certain personality types.

If you visit any Less Wrong page for the first time in a cookie-free browsing mode, you'll see this message for new users:

Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information.

Here are the worst violators I see on that about page:

Some people consider the Sequences the most important work they have ever read.

Generally, if your comment or post is on-topic, thoughtful, and shows that you're familiar with the Sequences, your comment or post will be upvoted.

Many of us believe in the importance of developing qualities described in Twelve Virtues of Rationality: [insert mystical sounding description of how to be rational here]

And on the sequences page:

If you don't read the sequences on Mysterious Answers to Mysterious Questions and Reductionism, little else on Less Wrong will make much sense.

This seems obviously false to me.

These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.

We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.

In general, I think we could stand to put more community effort into improving our About page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.

Cult impressions of Less Wrong/Singularity Institute
247 comments

AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.

I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC, about how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since it's exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:

Eliezer: Please, learn what turns good ideas into cults, and avoid it!
Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!

&

Eliezer: Do not worship a hero! Do not trust!
Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.

Really, am I the only one seeing the problem with this?

People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they p... (read more)

LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.

How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?

"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."

I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.

(the way Will_Newsome wants you to),

I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person

You mean when he saw himself in the mirror? :)

Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?

I think it's not an ethical imperative unless you're unusually altruistic.

Also I feel the whole FAI thing is a little questionable from a client relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us is not to take the opportunity and rewrite their goals while we're educating them.

9Wei Dai
It's hard not to rewrite someone's goals while educating them, because one of our inborn drives is to gain the respect and approval of people around us, and if that means overwriting some of our goals, well that's a small price to pay as far as that part of our brain is concerned. For example, I stayed for about a week at the SIAI house a few years ago when attending the decision theory workshop, and my values shifted in obvious ways just by being surrounded by more altruistic people and talking with them. (The effect largely dissipated after I left, but not completely.) Presumably the people they selected for the rationality mini-camp were already more altruistic than average, and the camp itself pushed some of them to the "unusually altruistic" level. Why should SIAI people have qualms about this (other than possible bad PR)?
-4TheAncientGeek
Pointing out that religious/cultic value rewriting is hard to avoid hardly refutes the idea that LW is a cult.
9Vladimir_Nesov
I don't think "unusually altruistic" is a good characterization of "doesn't value personal preferences about some life choices more than the future of humanity"...
6cousin_it
Do you believe most people are already quite altruistic in that sense? Why? It seems to me that many people give lip service to altruism, but their actions (e.g. reluctance to donate to highly efficient charities) speak otherwise. I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.

False dichotomy. Humans are not automatically strategic, we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general purpose skills that have an impact on behavior (and explicit goals) by correcting errors in reasoning, not specifically aimed at aligning students' explicit goals with those of their teachers.

Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.

(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)

I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.

3cousin_it
Yes, the reasoning in the linked posts implies that deep inside, humans should be as altruistic as you say. But why should I believe that reasoning? I'd feel a lot more confident if we had an art of rationality that made people demonstrably more successful in mundane affairs and also, as a side effect, made some of them support FAI. If we only get the side effect but not the main benefit, something must be wrong with the reasoning.
7Vladimir_Nesov
This is not what the posts are about, even if this works as one of the conclusions. The idea that urges and goals should be distinguished, for example, doesn't say what your urges or goals should be, it stands separately on its own. There are many such results, and ideas such as altruism or importance of FAI are only few among them. Do these ideas demonstrate comparatively more visible measurable effect than the other ideas?
2William_Quixote
If prediction markets were legal, we could much more easily measure if LW helped rationality. Just ask people to make n bets or predictions per month and see 1) if they did better than the population average and 2) if they improved over time. In fact, trying to get Intrade legal in the US might be a very worthwhile project for just this reason (beyond all the general social reasons to like prediction markets).
4gwern
There is no need to wish or strive for regulatory changes that may never happen: I've pointed out in the past that non-money prediction markets generally are pretty accurate and competitive with money prediction markets; so money does not seem to be a crucial factor. Just systematic tracking and judgment. (Being able to profit may attract some people, like me, but the fear of loss may also serve as a potent deterrent to users.) I have written at length about how I believe prediction markets helped me but I have been helped even more by the free active you-can-sign-up-right-now-and-start-using-it,-really,-right-now http://www.PredictionBook.com I routinely use LW-related ideas and strategies in predicting, and I believe my calibration graph reflects genuine success at predicting.
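To make the calibration idea concrete, here is a minimal, illustrative sketch of the bookkeeping a site like PredictionBook automates: record each prediction's stated probability and its outcome, then compute a Brier score and a per-confidence-level calibration table. The function name and the sample data below are made up for illustration.

```python
# Minimal sketch of tracking calibration (illustrative, not PredictionBook's actual code).
from collections import defaultdict

def calibration_report(predictions):
    """predictions: list of (stated_probability, outcome) pairs,
    where outcome is True if the predicted event happened."""
    buckets = defaultdict(list)
    for p, happened in predictions:
        buckets[round(p, 1)].append(happened)  # group predictions by nearest 10%
    # Brier score: mean squared gap between stated probability and what happened.
    brier = sum((p - happened) ** 2 for p, happened in predictions) / len(predictions)
    print(f"Brier score: {brier:.3f} (lower is better; always saying 50% scores 0.25)")
    for p in sorted(buckets):
        outcomes = buckets[p]
        print(f"  stated {p:.0%}: came true {sum(outcomes) / len(outcomes):.0%} "
              f"of the time ({len(outcomes)} predictions)")

# Made-up example data: (stated confidence, did it happen?)
calibration_report([(0.9, True), (0.9, True), (0.9, False),
                    (0.7, True), (0.7, False), (0.5, True)])
```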
1cousin_it
Very nice idea, thanks! After some googling I found someone already made this suggestion in 2009.
0William_Quixote
If other people have suggested this before, there may be enough background support to make it worth following up on this idea. When I get home from work, I will post in the discussion forum to see if people would be interested in working to legalize prediction markets (like Intrade) in the US. [EDITED: shortly after making this post, I saw gwern's post above suggesting that an alternative like PredictionBook would be just as good. As a result I did not make a post about legalizing prediction markets and instead tried PredictionBook for a month and a half. After this trial, I still think that making a push to legalize prediction markets would be worthwhile]
5Vaniver
It doesn't sound like you know all that many humans, then. In most times and places, the "future of humanity" is a signal that someone shouldn't be taken seriously, not an actual goal.
2Vladimir_Nesov
I was talking about the future of humanity, not the "future of humanity" (a label that can be grossly misinterpreted).
0Luke_A_Somers
... or you estimate the risk to be significant and you want to live past the next N years.

I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.
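A back-of-the-envelope version of that arithmetic: only the 10% and 9.99% figures come from the comment above, and the normalization of lifetime welfare to 1 is purely illustrative.

```python
# Purely selfish expected-value sketch; all numbers other than the risk figures are placeholders.
personal_risk_before = 0.10       # estimated personal risk of dying from the catastrophe
personal_risk_after = 0.0999      # risk after your whole-life sacrifice
value_of_surviving = 1.0          # normalize everything you'd enjoy by surviving to 1
cost_of_total_devotion = 1.0      # surrendering essentially all of that welfare

selfish_gain = (personal_risk_before - personal_risk_after) * value_of_surviving
print(f"expected selfish gain {selfish_gain:.4f} vs cost {cost_of_total_devotion}")  # 0.0001 vs 1.0
```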

-4Tripitaka
Funny in which way? Do you want to avoid an automatic "macro-of-denial" invocation, or are you afraid of them joining Eliezer's ever-growing crowd of memetically subverted FAI-lers?
0cousin_it
The latter, I think.
0[anonymous]
If I teach rationality and deliberately change my students' goals, that means I fail as a teacher. It's even worse if their new goal happens to be donating all their money to my organization.
5XiXiDu
I have always been extremely curious about this. Do people really sacrifice their lives or is it largely just empty talk? It seems like nobody is doing anything they wouldn't be doing anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do. Are there people who'd rather play games all day but sacrifice their lives to solve Friendly AI?

If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their life accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.

5Will_Newsome
Indeed, Eliezer once told me that he was a lot more gung-ho about saving the world when he thought it just meant building AGI as quickly as possible.

I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.

I have always been extremely curious about this. Do people really sacrifice their lives or is it largely just empty talk?

I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...

This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.

— Nick Tarleton's twist on T.S. Eliot

9Tripitaka
1. Due to comparative advantage, not changing much is actually a relatively good, straightforward strategy: just farm and redirect money. 2. As an example of these Altruistic Ones, user Rain has been mentioned, so they are out there. They all be praised! 3. Factor in time and demographics. A lot of LWers are young people, looking for ways to make money; they are not able to spend much yet, and haven't had much impact yet. Time will have to show whether these stay true to their goals, or whether they are tempted to go the vicious path of always-growing investments into status.
0drethelin
I'm too irreparably lazy to actually change my life but my charitable donations are definitely affected by believing in FAI.
3[anonymous]
Sacrificing or devoting? Those are different things. If FAI succeeds they will have a lot more life to party than they would have otherwise so devoting your life to FAI development might be a good bet even from a purely selfish standpoint.
0[anonymous]
Pascal? Izzat you?
0[anonymous]
That comment doesn't actually argue for contributing to FAI development. So I guess I'm not Pascal (damn).
1[anonymous]
You probably don't wanna be Pascal anyway. I'm given to understand he's been a metabolic no-show for about 350 years.
1Grognor
I agree entirely. That post made me go "AAAH" and its rapid karma increase at first made me go "AAAAHH"

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users. I agree LW rocks in general. I think we're mostly talking past each other; I don't see this discussion post as fitting into the genre of "serious LW criticism" as the other stuff you link to.

In other words, I'm talking about first impressions, not in-depth discussions.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme. That sounds pretty implausible to me. Keep in mind that no one who is fully familiar with LW is making this accusation (that I know of), but it does look like it might be a reaction that sometimes occurs in newcomers.

Let's keep in mind that LW being bad is a logically distinct proposition, and if it is bad, we want to know it (since we want to know what is true, right?).

And if we can make optimizations to LW culture to broaden participation from intelligent people, that's also something we want to do, right? Although, on reflection, I'm not sure I see an opportunity for improvement where this is concerned, except maybe on the wiki (but I do think ... (read more)

My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users.

Okay.

If we want to win, it might not be enough to have a book length document explaining why we're not a cult. We might have to play the first impressions game as well.

I said stop talking about it and implied that maybe it shouldn't have been talked about so openly in the first place, and here you are talking about it.

I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme.

Where else could it have come from? Eliezer's extensive discussion of cultish behavior gets automatically pattern-matched into helpless cries of "LW is not a cult!" (even though that isn't what he's saying and isn't what he's trying to say), and this gets interpreted as, "LW is a cult." Seriously, any time you put two words together like that, people assume they're actually related.

Elsewise, the only thing I can think of is our similar demographics and a horribly mistaken impression that we all agree on everything (I don't know where this comes from).

Criticism rocks dude.

Okay. (I hope you didn't interpret anything I said as meaning otherwise.)

8John_Maxwell
Point taken; I'll leave the issue alone for now.
5Antisuji
Ya know, if LW and SIAI are serious about optimizing appearances, they might consider hiring a Communications professional. PR is a serious skill and there are people who do it for a living. Those people tend to be on the far end of the spectrum of what we call neurotypical here. That is, they are extremely good at modeling other people, and therefore predicting how other people will react to a sample of copy. I would not be surprised if literally no one who reads LW regularly could do the job adequately. Edit to add: it's nice to see that they're attempting to do this, but again, LW readership is probably the wrong place to look for this kind of expertise.
8wedrifid
People who do this for a living (effectively) cost a lot of money. Given the budget of SIAI, putting a communications professional on the payroll at market rates represents a big investment. Transitioning a charity to a state where a large amount of income goes into improving perception (and so securing more income) is a step not undertaken lightly.

It's at least plausible that a lot of the people who can be good for SIAI would be put off more by professional marketing than by science fiction-flavored weirdness.

2Antisuji
That's a good point. I'm guessing though that there's a lot of low hanging fruit, e.g. a front page redesign, that would represent a more modest (and one-time) expense than hiring a full-time flack. In addition to costing less this would go a long way to mitigate concerns of corruption. Let's use the Pareto Principle to our advantage!

AAAAARRRGH! I am sick to death of this damned topic.

It looks a bit better if you consider the generalization in the intro to be mere padding around a post that is really about several specific changes that need to be made to the landing pages.

5John_Maxwell
Unfortunately, Grognor reverts me every time I try to make those changes... Bystanders, please weigh in on this topic here.
3Vladimir_Nesov
I didn't like your alternative for the "Many of us believe" line either, even though I don't like that line (it was what I came up with to improve on Luke's original text). To give the context: the current About page introduces twelve virtues with: John's edit was to change it to: P.S. I no longer supervise the edits to the wiki, but someone should...
0John_Maxwell
He didn't like my other three attempts at changes either... I could come up with 10 different ways of writing that sentence, but I'd rather let him make some suggestions.
4wedrifid
If you made the suggestions here and received public support for one of them it wouldn't matter much what Grognor thought.
0John_Maxwell
Why don't you make a suggestion?
5wedrifid
*cough* Mine is 'delete the sentence entirely'. I never really liked that virtues page anyway!
3John_Maxwell
Sounds like a great idea.
2lessdazed
I entirely agree with this.
2John_Maxwell
To be clear, you are in favor of leaving the virtues off of the about page, correct?
1wedrifid
For what it is worth, yes.
0John_Maxwell
Okay, thanks. One of the other wiki editors didn't think you meant that.
2Vladimir_Nesov
Whatever wedrifid actually meant is not "apparent consensus", given that there are just 2 upvotes on the statement where it wasn't apparent to the voters what he actually meant... Reverted, with a suggestion to escalate to a discussion post and vote more clearly. Also, this started from talking about bad wording, which is a separate question from leaving the section out altogether, so the hypothetical discussion posting should distinguish those questions.
-2John_Maxwell
Okay.
-1wedrifid
That change is less bad than the original but it is sometimes better to hold off on changes that may reduce the impetus for further improvement without quite satisfying the need.
0John_Maxwell
To be honest, I don't have much energy left to fight this. I'd like to rethink the entire page, but if I have to fight tooth and nail for every sentence I won't.
-1wedrifid
Who on earth is Grognor?
7Grognor
Hi?
1Nisan
In. Who in earth.
1wedrifid
Is this a jest about Grognor sounding like the name of a dwarf or a mythical beast of the depths?
4Nisan
I'm afraid so.

A rambling, cursing tirade against a polite discussion of things that might be wrong with the group (or perceptions of the group) doesn't improve my perception of the group. I have to say, I have a significant negative impression from Grognor's response here. In addition to the tone of his response, a few things that added to this negative impression were:

"how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it"

Again, the name dropping of Our Glorious Leader Eliezer, long may He reign. (I'm joking here for emphasis.)

"LW is a cult hur hur"

People might not be thinking completely rationally, but this kind of characterization of people who have negative opinions of the group doesn't win you any friends.

"since it's exactly what Eliezer was trying to combat by writing it."

There's Eliezer again, highlighting his importance as the group's primary thought leader. This may be true, and probably is, but highlighting it all the time can lead people to think this is cultish.

5XiXiDu
Thanks for saying that I significantly helped to make Less Wrong look less cultish ;-) By the way...
2jimrandomh
Actually, I believe what he said was that you generated evidence that Less Wrong is not cultish, which makes it look more cultish to people who aren't thinking carefully.
5dbaupp
A widely revered figure who has written a million+ words that form the central pillars of LW, and has been directly (or indirectly) responsible for bringing many people into the rationality memespace, says "don't do X", so it is obvious that X must be false. Dismissing accusations of a personality cult around Eliezer by saying Eliezer said "no personality cult" is a fairly poor way of going about it. Two key points:
  • saying "as a group, we don't worship Eliezer" doesn't guarantee that it is true (groupthink could easily suck us into ignoring evidence)
  • someone might interpret what Eliezer said as false modesty or an attempt to appear to be a reluctant saviour/messiah (i.e. using dark arts to suck people in)
2epicureanideal
"I have become fully convinced that even bringing it up is actively harmful." What evidence leads you to this conclusion? Can you provide evidence to support this characterization? Can you provide evidence to support this characterization? I would like to see some empirical analysis of the points made here and by the original poster. We should gather some data about perceptions from real users and use that to inform future discussion on this topic. I think we have a starting point in the responses to this post, and comments in other posts could probably be mined for information, but we should also try to find some rational people who are not familiar with less wrong and introduce them to it and ask them for their impressions (from someone acting like they just found the site, are not affiliated with it, and are curious about their friend's impressions, or something like that).
2XiXiDu
No, it is not. A lack of self-criticism and evaluation is one of the reasons why people assign cult status to communities. P.S. Posts with titles along the lines of 'Epistle to the New York Less Wrongians' don't help in reducing cultishness ;-) (Yeah, I know it was just fun.)
0halcyon
Actually, I believe the optimal utilitarian attitude would be to make fun of them. If you don't take them at all seriously, they will grow to doubt themselves. If you're persistently humorous enough, some of them, thinking themselves comedians, will take your side in poking fun at the rest. In time, LW will have assembled its own team of Witty Defenders responsible for keeping non-serious accusations at bay. This will ultimately lead to long pages of meaningless back and forth between underlings, allowing serious LWians to ignore these distracting subjects altogether. Also, the resulting dialogue will advertise the LW community, while understandably disgusting self-respecting thinkers of every description, thus getting them interested in evaluating the claims of LW on its own terms. Personally, I think all social institutions are inevitably a bit cultish (society = mob minus the negative connotations), and they all use similarly irrational mechanisms to shield themselves from criticism and maintain prestige. A case could be made that they have to, one reason being that most popular "criticism" is of the form "I've heard it said or implied that quality X is to be regarded as a Bad Thing, and property Y of your organization kind of resembles X under the influence of whatever it is that I'm smoking," or of equally abysmal quality. Heck, the United States government, the most powerful public institution in the world, is way more cultish than average. Frankly, more so than LW has ever been accused of being, to my knowledge. Less Wrong: Less cultish than America!

The top autocompletes for "Less Wrong" are

  • sequences
  • harry potter
  • meetups

These are my (logged-in) Google autocomplete results for typing "Less Wrong X", where X is each letter of the alphabet (some duplicates appear):

  • akrasia
  • amanda knox
  • atheism
  • australia
  • blog
  • bayes
  • basilisk
  • bayes theorem
  • cryonics
  • charity
  • cult
  • discussion
  • definition
  • decoherence
  • decision theory
  • epub
  • evolutionary psychology
  • eliezer yudkowsky
  • evidence
  • free will
  • fanfiction
  • fanfic
  • fiction
  • gender
  • games
  • goals
  • growing up is hard
  • harry potter
  • harry potter and the methods of rationality
  • how to be happy
  • hindsight bias
  • irc
  • inferential distance
  • iq
  • illusion of transparency
  • joint configurations
  • joy in the merely real
  • kindle
  • amanda knox
  • lyrics
  • luminosity
  • lost purposes
  • leave a line of retreat
  • meetup
  • mobi
  • meditation
  • methods of rationality
  • newcomb's problem
  • nyc
  • nootropics
  • neural categories
  • optimal employment
  • overcoming bias
  • open thread
  • outside the laboratory
  • procrastination
  • pdf
  • polyamory
  • podcast
  • quantum physics
  • quotes
  • quantum mechanics
  • rationality quotes
  • rationality quotes
  • rationalwiki
  • reading list
  • rationality
  • sequences
  • survey
  • survey results
  • sequences pdf
  • twitter
  • textbooks
  • three worlds collide
  • toronto
  • ugh fields
  • universal fire
  • v
... (read more)
0timtyler
Luke's link to How Cults work is pretty funny.

Google's autocomplete has a problem, which has produced controversy in other contexts: when people want to know whether X is trustworthy, the most informative search they can make is "X scam". Generally speaking, they'll find no results and that will be reassuring. Unfortunately, Google remembers those searches, and presents them later as suggestions - implying that there might be results behind the query. Once the "X scam" link starts showing up in the autocomplete, people who weren't really suspicious of X click on it, so it stays there.
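A toy simulation of the feedback loop described above; the threshold, click rate, and search counts are invented, and this is not a claim about how Google's suggestion algorithm actually works.

```python
# Toy model: a query enters autocomplete once its recent search count crosses a
# threshold; once visible, curiosity clicks keep it above the threshold even
# after organic suspicion has faded. All numbers below are invented.
THRESHOLD = 50            # searches per period needed to be shown as a suggestion
CURIOSITY_CLICKS = 60     # extra searches per period generated by the suggestion itself

organic = [80, 60, 40, 20, 10, 5, 5, 5]   # people typing "X scam" on their own
suggested = False
for period, base in enumerate(organic):
    searches = base + (CURIOSITY_CLICKS if suggested else 0)
    suggested = searches >= THRESHOLD
    print(f"period {period}: {searches:3d} searches, suggested next period: {suggested}")
```

In this toy run the suggestion stays visible indefinitely even though organic suspicion drops to almost nothing, which is the self-reinforcing effect described above.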

6beoShaffer
Personal anecdote warning. I semi-routinely google the phrase "X cult" when looking into organizations.

Does this ever work?

3beoShaffer
I think so, but it's hard to say. I look into organizations infrequently enough that semi-routinely leaves me with a very small sample size. The one organization that had prominent cult results (not going to name it for obvious reasons) does have several worrying qualities. And they seem related to why it was called a cult. -edit minor grammar/style fix
2John_Maxwell
Thanks; I updated the post to reflect this.

Eliezer addressed this in part with his "Death Spiral" essay, but there are some features of LW/SI that are strongly correlated with cultishness, other than the ones that Eliezer mentioned, such as fanaticism and following the leader:

  • Having a house where core members live together.
  • Asking followers to completely adjust their thinking processes to include new essential concepts, terminologies, and so on to the lowest level of understanding reality.
  • Claiming that only if you carry out said mental adjustment can you really understand the most important parts of the organization's philosophy.
  • Asking for money for a charity, particularly one which does not quite have the conventional goals of a charity, and claiming that one should really be donating a much larger percentage of one's income than most people donate to charity.
  • Presenting an apocalyptic scenario including extreme bad and good possibilities, and claiming to be the best positioned to deal with it.
  • [Added] Demand you leave any (other) religion.

Sorry if this seems over-the-top. I support SI. These points have been mentioned, but has anyone suggested how to deal with them? Simply ignoring the problem does not seem to be the solution; nor does loudly denying the charges; nor changing one's approach just for appearances.

Perhaps consider adding to the list the high fraction of revenue that ultimately goes to paying staff wages.

Oh yes, and fact that the leader wants to SAVE THE WORLD.

5Bongo
About a third in 2009, the last year for which we have handy data.
1timtyler
Practically all of it goes to them or their "associates" - by my reckoning. In 2009 some was burned on travel expenses and accommodation, some was invested - and some was stolen. Who was actually helped? Countless billions in the distant future - supposedly.
8dbaupp
What else should it go to? (Under the assumption that SI's goals are positive.) As Larks said above, they are doing thought work: they are not trying to ship vast quantities of food or medical supplies. The product of SI is the output from their researchers, the only way to get more output is to employ more people (modulo improving the output of the current researchers, but that is limited).
6timtyler
So, to recap, this is a proposed part of a list of ways in which the SIAI resembles a cult. It redistributes economic resources from the "rank and file" members up the internal hierarchy without much expenditure on outsiders - just like many cults do.
5dbaupp
(Eh. Yes, I think I lost track of that a bit.) Keeping that in mind: SI has a problem, because acting to avoid the appearance of existing to give money to the upper ranks means that they can't pay their researchers. There are three broad classes of solutions to this (that I can see):
  • Give staff little to no compensation for their work
  • Use tricky tactics to try to conceal how much money goes to the staff
  • Try to explain to everyone why such a large proportion of the money goes to the staff
All of those seem suboptimal.
5epicureanideal
Why was this downvoted instead of responded to? Downvoting people who are simply stating negative impressions of the group doesn't improve impressions of the group.
5JoshuaFox
Most organizations spend most of their money on staff. What else could you do with it? Paying fellowships for "external staff" is a possibility. But in general, good people are exactly what you need.
2timtyler
Often goods or needy beneficiaries are also involved. Charity spending is sometimes classified into:
  • Program Expenses
  • Administrative Expenses
  • Fundraising Expenses
This can be used as a heuristic for identifying good charities. Not enough in category 1 and too much in categories 2 and 3 is often a bad sign.
Larks120

But they're not buying malaria nets, they're doing thought-work. Do you expect to see an invoice for TDT?

Quite apart from the standard complaint about how awful a metric that is.

-1epicureanideal
And yet there are plenty of things that don't cost much money that they could be doing right now, that I have previously mentioned to SIAI staff and will not repeat (edit: in detail) because it might interfere with my own similar efforts in the near future. Basically I'm referring to public outreach, bringing in more members of the academic community, making people aware that LW even exists (I wasn't, until I randomly ran into a few LWers in person), etc. What's the reason for downvoting this? Please comment.
3epicureanideal
As I've discussed with several LWers in person, including some staff and visiting fellows, one of the things I disliked about LW/SIAI was that so much of the resources of the organization go to pay the staff. They seemingly wouldn't even consider proposals to spend a few hundred dollars on other things because they claimed it was "too expensive".
5TheAncientGeek
Add:
  • Leader(s) are credited with expertise beyond that of conventional experts in subjects they are not conventionally qualified in.
  • Studying conventional versions of subjects is deprecated in favour of in-group versions.
3JoshuaFox
Also: Associated with non-standard and non-monogamous sexual practices. (Just some more pattern-matching on top of what you see in the parent and grandparent comment. I don't actually think this is a strong positive indicator.)
3TheAncientGeek
The usual version of that indicator is "leader has sex with followers"
2MTGandP
One fundamental difference between LW and most cults is that LW tells you to question everything, even itself.
7gwern
Most, but not all. The Randians come to mind. Even the Buddha encouraged people to be critical, but that doesn't seem to have stopped the cults. I was floored to learn a few weeks ago that Buddhism has formalized even when you stop doubting! When you stop doubting, you become a Sotāpanna; a Sotāpanna is marked by abandoning '3 fetters', the second fetter according to Wikipedia being doubt. As well, as unquestioningness becomes a well-known trait of cults, cults tend to try to hide it. Scientology hides the craziest dogmas until you're well and hooked, for example.
2[anonymous]
If the Randians are a cult, LW is a cult. Like the others, the members just think it's unique in being valid.
2Desrtopa
If a person disagrees with Rand about a number of key beliefs, do they still count as a Randian?
0Peterdjones
If they don't count as an Orthodox Randian, they can always become a Liberal Randian
0[anonymous]
That depends in large part on what "a number of key beliefs" is.
-2MugaSofer
Could you elaborate on this?
0MTGandP
So there comes a point in Buddhism where you're not supposed to be skeptical anymore. And Objectivists aren't supposed to question Ayn Rand.
1Mitchell_Porter
Would it be productive to be skeptical about whether your login really starts with the letter "M"? Taking an issue off the table and saying, we're done with that, is not in itself a bad sign. The only question is whether they really do know what they think they know. I personally endorse the very beginning of Objectivist epistemology - I mean this: "Existence exists—and the act of grasping that statement implies two corollary axioms: that something exists which one perceives and that one exists possessing consciousness, consciousness being the faculty of perceiving that which exists." It's the subsequent development which is a mix of further gemlike insights, paths not taken, and errors or uncertainties that are papered over. In the case of Buddhism, one has the usual problem of knowing, at this historical distance, exactly what psychological and logical content defined "enlightenment". One of its paradoxes is that it sounds like the experience of a phenomenological truth, and yet the key realization is often presented as the discovery that there is no true self or substantial self. I would have thought that achieving reflective consciousness implied the existence of a reflector, just as in the Objectivist account. Then again, reflection can also produce awareness that traits with which you have identified yourself are conditioned and contingent, so it can dissolve a naive concept of self, and that sounds more like the Buddhism we hear about today. The coexistence of a persistent observing consciousness, and a stream of transient identifications, in certain respects is like Hinduism; though the Buddhists can strike back by saying that the observing consciousness is not eternal and free of causality, it too exists only if it has been caused to exist. So claims to knowledge, and the existence of a stage where you no longer doubt that this really is knowledge, and get on with developing the implications, do not in themselves imply falsity. In a systematic philosophy
0timtyler
There seems to be some detailed substructure there - which I go over here.
-1timtyler
Not just a cult - an END OF THE WORLD CULT. My favourite documentary on the topic: The End of The World Cult.

Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?

I only discovered LW about a week ago, and I got the "cult" impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads on here and realized that they really are rational, well argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting out that time and effort.

For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird; those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.

AI and transhumanism are very interesting, but are distinct concepts from rationality. I suggest moving singularity and AI specific articles to a different site, and removing the singularity institute and FHI links from the navigation bar.

There's also the pro... (read more)

Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.

Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).

Upvoted for sounding a lot like the kinds of complaints I've heard people say about LW and SIAI.

There is a large barrier to entry here, and if we want to win more, we can't just blame people for not understanding the message. I've been discussing with a friend what is wrong with LW pedagogy (though he admits that it is certainly getting better). To paraphrase his three main arguments:

  • We often use nomenclature without necessary explanation for a general audience. Sure, we make generous use of hyperlinks, but without some effort to bridge the gap in the body of our text, we aren't exactly signalling openness or friendliness.

  • We have a tendency to preach to the converted. Or as the friend said:

    It's that classic mistake of talking in a way where you're convincing or explaining something to yourself or the well-initiated instead of laying out the roadwork for foreigners.

    He brought up an example for how material might be introduced to newly exposed folk.

    If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.

The curse of knowledg... (read more)

If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.

That's an inspiring goal, but it might be worth pointing out that the This American Life episode was extraordinary-- when I heard it, it seemed immediately obvious that this was the most impressively clear and efficient hour I'd heard in the course of a lot of years of listening to NPR.

I'm not saying it's so magical that it can't be equaled, I'm saying that it might be worth studying.

Here's what an outsider might see:

"doomsday beliefs" (something "bad" may happen eschatologically, and we must work to prevent this): check

a gospel (The Sequences): check

vigorous assertions of untestable claims (Everett interpretation): check

a charismatic leader extracting a living from his followers: check

is sometimes called a cult: check

This is enough to make up a lot of minds, regardless of any additional distinctions you may want to make, sadly.

5[anonymous]
But an outsider would have to spend some time here to see all those things. If they think LW is accurately described by the c-word even after getting acquainted with the site, there might be no point in trying to change their minds. It's better to focus on people who are discouraged by first impressions.
2advancedatheist
I recently read an article about Keith Raniere, the founder of a cult called NXIVM (pronounced "nexium"): http://www.timesunion.com/local/article/Secrets-of-NXIVM-2880885.php Raniere reminds me of Yudkowsky, especially after reading cult expert Rick Ross's assessment of Raniere:
gRR190

I'm here for only a couple of months, and I didn't have any impression of cultishness. I saw only a circle of friends doing a thing together, and very enthusiastic about it.

What I also did see (and still do) is specific people just sometimes being slightly crazy, in a nice way. As in: Eliezer's treatment of MWI. Or way too serious fear of weird acausal dangers that fall out of the current best decision theories.
Note: this impression is not because of craziness of the ideas, but because of taking them too seriously too early. However, the relevant posts always have sane critical comments, heavily upvoted.

I'm slightly more alarmed by posts like How would you stop Moore's Law?. I mean, seriously thinking of AI dangers is good. Seriously considering nuking Intel's fabs in order to stop the dangers is... not good.

3Luke_A_Somers
Agreed, except the treatment of MWI does not seem the least bit crazy to me. But what do I know - I'm a crazy physicist.
5roystgnr
The conclusions don't seem crazy (well, they seem "crazy-but-probably-correct", just like even the non-controversial parts of quantum mechanics), but IIRC the occasional emphasis on "We Have The One Correct Answer And You All Are Wrong" rang some warning bells. On the other hand: Rationality is only useful to the extent that it reaches conclusions that differ from e.g. the "just believe what everyone else does" heuristic. Yet when any other heuristic comes up with new conclusions that are easily verified, or even new conclusions which sound plausible and aren't disproveable, "just believe what everyone else does" quickly catches up. So if you want a touchstone for rationality in an individual, you need to find a question for which rational analysis leads to an unverifiable, implausible sounding answer. Such a question makes a great test, but not such a great advertisement...
0Dmytry
Choosing between mathematically equivalent interpretations adds 1 bit of complexity that doesn't need to be added. Now, if EY had derived the Born probabilities from first principles, that'd be quite interesting.
2wedrifid
That's a positive impression. People really look that enthusiastic and well bonded?
8gRR
Yes to well bonded. People here seem to understand each other far better than average on the net, and it is immediately apparent. Enthusiastic is the wrong word, I suppose. I meant sure of doing a good thing, happy to be doing it, etc., not in the sense of applauding and cheering.
4wedrifid
Thank you. It is good to be reminded that these things are relative. Sometimes I forget to compare interactions to others on the internet, and instead compare them to interactions with people as I would prefer them to be, or even just interactions with people I know in person (and have rather ruthlessly selected for not being annoying).

Speaking for myself, I know of at least four people who know of Less Wrong/SI but are not enthusiasts, possibly due to atmosphere issues.

An acquaintance of mine attends Less Wrong meetups and describes most of his friends as being Less Wrongers, but doesn't read Less Wrong and privately holds reservations about the entire singularity thing, saying that we can't hope to say much about the future more than 10 years in advance. He told me that one of his coworkers is also skeptical of the singularity.

A math student/coder I met at an entrepreneurship event told me Less Wrong had good ideas but was "too pretentious".

I was interviewing for an internship once, and the interviewer and I realized we had a mutual acquaintance who was a Less Wronger and SI donor. He asked me if I was part of that entire group, and I said yes. His attitude was a bit derisive.

5timtyler
The FHI are trying to do a broadly similar thing from within academia. They seem less kooky and cultish - probably as a result of trying harder to avoid cultishness.
1[anonymous]
I don't know why you would assume that it's "probably as a result of trying harder to avoid cultishness." My prior is that they just don't seem cultish because academics are often expected to hold unfamiliar positions.
5BrandonReinhart
I will say that I feel 95% confident that SIAI is not a cult because I spent time there (mjcurzi was there also), learned from their members, observed their processes of teaching rationality, hung out for fun, met other people who were interested, etc. Everyone involved seemed well meaning, curious, critical, etc. No one was blindly following orders. In the realm of teaching rationality, there was much agreement it should be taught, some agreement on how, but total openness to failure and finding alternate methods. I went to the minicamp wondering (along with John Salvatier) whether the SIAI was a cult and obtained lots of evidence to push me far away from that position. I wonder if the cult accusation in part comes from the fact that it seems too good to be true, so we feel a need for defensive suspicion. Rationality is very much about changing one's mind and thinking about this we become suspicious that the goals of SIAI are to change our minds in a particular way. Then we discover that in fact the SIAI's goals (are in part) to change our minds in a particular way so we think our suspicions are justified. My model tells me that stepping into a church is several orders of magnitude more psychologically dangerous than stepping into a Less Wrong meetup or the SIAI headquarters. (The other 5% goes to things like "they are a cult and totally duped me and I don't know it", "they are a cult and I was too distant from their secret inner cabals to discover it", "they are a cult and I don't know what to look for", "they aren't a cult but they want to be one and are screwing it up", etc. I should probably feel more confident about this than 95%, but my own inclination to be suspicious of people who want to change how I think means I'm being generous with my error. I have a hard time giving these alternate stories credit.)
4daenerys
I would consider myself a pretty far outlier on LessWrong (as a female, ENFP (people-person, impulsive/intuitive), Hufflepuff type). So on one hand, my opinion may mean less, because I am not generally the "type" of person associated with LW. On the other hand, if you want to expand LW to more people, then I think some changes need to be made for other "types" of people to also feel comfortable here. Along with the initial "cult" impression (which eventually dissipates, IMO), what threw me most is the harshness of the forums. I've been on here for about 4 months now, and it's still difficult for me to deal with. Also, I agree that topics like FAI and Singularitarianism aren't necessarily the best things to be discussing when trying to get people interested in rationality. I am well aware that the things that would make LW more comfortable for me and others like me would make it less comfortable for many of the current posters. So there is definitely a conflict of goals.
Goal A: Grow LW and make rationality more popular. Need to make LW more "nice" and perhaps focused on Instrumental Rationality rather than Singularity and FAI issues.
Goal B: Maintain current culture and level of posts. Need to NOT significantly change LW, and perhaps focus more on the obscure posts that are extremely difficult for newer people to understand.
AFAICT pursuit of either of these goals will be to the detriment of the other goal.
0NancyLebovitz
Could you be more specific about what comes off as harsh to you? If you'd rather address this as a private message, I'm still interested.
5John_Maxwell
What comes across as harsh to me: downvoting discussion posts because they're accidental duplicates or don't fit some idea of what a discussion post is supposed to be, the amount of downvoting that goes on in general, and unbridled or curt disagreement (like Grognor's response to my post. You saw him cursing and yelling, right? I made this post because I thought the Less Wrong community could use optimization on the topics I wrote about, not because I wanted to antagonize anyone.)
0daenerys
PM'd response. General agreement with John below (which I didn't see until just now).
3Nisan
This person might have been in the same place as a math grad student I know. They read a little Less Wrong and were turned off. Then they attended a LW-style rationality seminar and responded positively, because it was more "compassionate". What they mean is this: A typical epistemology post on Less Wrong might sound something like (That's not a quote.) Whereas the seminar sounded more like Similarly, an instrumental-rationality post here might sound like Whereas the seminar sounds more like Of course, both approaches are good and necessary, and you can find both on Less Wrong.

Defending oneself from the cult accusation just makes it worse. Did you write a long excuse why you are not a cult? Well, that's exactly what a cult would do, isn't it?

To be accused is to be convicted, because the allegation is unfalsifiable.

Trying to explain something is drawing more attention to the topic, from which people will notice only the keywords. The more complex an explanation you make, especially if it requires reading some of your articles, the worse it gets.

The best way to win is to avoid the topic.

Unfortunately, someone else can bring this topic and be persistent enough to make it visible. (Did it really happen on a sufficient scale, or are we just creating it by our own imagination?) Then, the best way is to make some short (not necessarily rational, but cached-thought convincing) answer and then avoid the topic. For example: "So, what exactly is that evil thing people on LW did? Downvote someone's forum post? Seriously, guys, you need to get some life."

And now, everybody stop worrying and get some life. ;-)

It could also help to make the site seem a bit less serious. For example put more emphasis on the instrumental rationality on the front page. People discu... (read more)

People discussing best diet habits don't seem like a doomsday cult, right?

I'm having trouble thinking up examples of cults, real or fictional, that don't take an interest in what their members eat and drink.

2epicureanideal
I don't think the best way to win is to avoid the topic. A healthy discussion of false impressions and how to correct them, or other failings a group may have, is a good indication to me of a healthy community. This post for example caused my impression of LW to increase somewhat, but some of the responses to it have caused my impression to decrease below its original level.
6Viliam_Bur
Then let's discuss "false impressions" or even better "impressions" in general, not focusing on cultishness, which even cannot be defined (because there are so many different kind of cults). If we focus on making things right, we do not have to discuss hundred ways they could go wrong. What is our community (trying to be) like? Friendly. In more senses of the word: we speak about ethics, we are trying to make a nice community, we try to help each other become stronger and win. Rational. Instead of superstition and gossip, we discuss how and why things really happen. Instead of happy death spirals, we learn about the world around us. Professional. By that I do not mean that everyone here is an AI expert, but that the things we do and value here (studying, politeness, exactness, science) are things that for most people correlate positively with their jobs, rather than free time. Even when we have fun, it's adult people having fun. So where exactly in the space of human organizations do we belong? Which of the cached-thoughts can be best applied to us? People will always try to fit us to some existing model (for example: cult), so why not choose this model rationally? I am not sure, but "educational NGO" sounds close. Science, raising the sanity waterline, et cetera. By seeming as something well-known, we become less suspicious, more normal.
1MugaSofer
This. Seriously, we need to start doing all the stuff recommended here, but this is perhaps the simplest and most immediate. Someone go do it.
[-][anonymous]160

Some things that might be problematic:

We use the latest insights from cognitive science, social psychology, probability theory, and decision theory to improve our understanding of how the world works and what we can do to achieve our goals.

I don't think we actually do that. Insights, sure, but latest insights? Also, it's mostly cognitive science and social psychology. The insights from probability and decision theory are more in the style of The Simple Math of Everything.

Want to know if your doctor's diagnosis is correct? It helps to understand Bayes' Theorem.

This might sound weird to someone who hasn't already read the classic example about doctors not being able to calculate conditional probabilities. Like we believe Bayes' Theorem magically grants us medical knowledge or something.
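To make the reference concrete, here is a minimal sketch (in Python) of the kind of calculation that classic example turns on. The numbers are illustrative assumptions only (1% prevalence, 80% sensitivity, 9.6% false-positive rate), not figures taken from the comment or the about page.

```python
# A minimal sketch of the classic diagnostic-test example alluded to above.
# All numbers are illustrative assumptions, not data from the original discussion.

prevalence = 0.01        # P(disease): 1% of patients have the condition
sensitivity = 0.80       # P(positive test | disease)
false_positive = 0.096   # P(positive test | no disease)

# Bayes' Theorem: P(disease | positive) = P(positive | disease) * P(disease) / P(positive)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive test) = {p_disease_given_positive:.2%}")
# Roughly 7.8%, far lower than the 70-80% many people (doctors included) intuitively guess.
```

The point of the example is that the posterior is much lower than intuition suggests, not that Bayes' Theorem supplies any medical knowledge by itself.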

[the link to rationality boot-camp]

I'm not a native speaker of English so I can't really tell, but I recall people complaining that the name 'boot-camp' is super creepy.

On the about page:

Introduce yourself to the community here.

That's not cultish-sounding, but it's unnecessarily imperative. The introduction thread is optional.

Disclaimer: My partner and I casually refer to LW meetups (which I attend and she does not) as "the cult".

That said, if someone asked me if LW (or SIAI) "was a cult", I think my ideal response might be something like this:

"No, it's not; at least not in the sense I think you mean. What's bad about cults is not that they're weird. It's that they motivate people to do bad things, like lock kids in chain-lockers, shun their friends and families, or kill themselves). The badness of being a cult is not being weird; it's doing harmful things — and, secondarily, in coming up with excuses for why the cult gets to do those harmful things. Less Wrong is weird, but not harmful, so I don't think it is a cult in the sense you mean — at least not at the moment.

"That said, we do recognize that "every cause wants to be a cult", that human group behavior does sometimes tend toward cultish, and that just because a group says 'Rationality' on the label does not mean it contains good thinking. Hoping that we're special and that the normal rules of human behavior don't apply to us, would be a really bad idea. It seems that staying self-critical, understanding how ... (read more)

What's bad about cults is not that they're weird. It's that they motivate people to do bad things...

People use "weird" as a heuristic for danger, and personally I don't blame them, because they have good Bayesian reasons for it. Breaking a social norm X is positively correlated with breaking a social norm Y, and the correlation is strong enough for most people to notice.

The right thing to do is to show enough social skill to avoid triggering the weirdness alarm. (Just as publishing in serious media is the right way to avoid the "pseudoscience" label.) You cannot expect outsiders to make an exception for LW, suspend their heuristics, and explore the website deeply; that would be asking them to privilege a hypothesis.

If something is "weird", we should try to make it less weird. No excuses.

-1ryjm
So we should be Less Weird now? ;)
1Viliam_Bur
We should be winning. Less Weird is a good heuristic for winning (though a bad heuristic for a site name).

Often by the time a cult starts doing harmful things, its members have made both real and emotional investments that turn out to be nothing but sunk costs. To avoid ever getting into such a situation, people come up with a lot of ways to attempt to identify cults based on nothing more than the non-harmful, best-foot-forward appearance that cults first try to project. If you see a group using "love bombing", for instance, the wise response is to be wary - not because making people feel love and self-esteem is inherently a bad thing, but because it's so easily and commonly twisted toward ulterior motives.

2CasioTheSane
That is, until people start bombing factories to mitigate highly improbable existential risks.

Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?

What do you mean, "initially"? I am still getting that impression! For example, just count the number of times Eliezer (who appears to have only a single name, like Prince or Jesus) is mentioned in the other comments on this post. And he's usually mentioned in the context of "As Eliezer says...", as though the mere fact that it is Eliezer who says these things were enough.

The obvious counter-argument to the above is, "I like the things Eliezer says because they make sense, not because I worship him personally", but... well... that's what one would expect a cultist to say, no?

Less Wrongers also seem to have their own vocabulary ("taboo that term or risk becoming mind-killed, which would be un-Bayesian"). We spend a lot of time worrying about doomsday events that most people would consider science-fictional (at best). We also cultivate a vaguely menacing air of superiority, as we talk about uplifting the ignorant masses by spreading our doctrine of rationality. As far as warning signs go, we've got it covered...

Specialized terminology is really irritating to me personally, and, I would think, off-putting to most new visitors. If you talk to any Objectivists or other cliques with their own internal vocabulary, it can be very bothersome. It also creates a sense that the group is insulated from the rest of the world, which adds to the perception of cultishness.

8[anonymous]
I think the phrase 'raising the sanity waterline' is a problem. As is the vaguely religious language, like 'litany of Tarski'. I looked up the definition of 'litany' to make sure I was picking up on a religious denotation and not a religious connotation, and here's what I got: [...] Not a great word, I think. Also 'Bayesian Conspiracy.' There's no conspiracy, and there shouldn't be.

Agreed. I realize that words like "litany" and "conspiracy" are used semi-ironically, but a newcomer to the site might not realize that.

3William_Quixote
This wording may lose a few people, but it probably helps with many people as well. The core subject matter of rationality could very easily be dull or dry or "academic". The tongue-in-cheek and occasionally outright goofy humor makes the Sequences a lot more fun to read. The tone may have costs, but not being funny has costs too. If you think back to college, more professors have students tune out by being boring than by being esoteric.
1Jiro
(Responding to old post.) One problem with such ironic usage is that people tend to joke about things that cause them stress, and that includes uncomfortable truths or things that are getting too close to the truth. It's why it actually makes sense to detain people making bomb jokes in airports. So just because the words are used ironically doesn't mean they can't reasonably be taken as signs of a cult, even by people who recognize that they are being used ironically. (Although this is somewhat mitigated by the fact that many cults won't allow jokes about themselves at all.)
2Eneasz
You'd have to be new to the entire internet to think those are being used seriously. And if you're THAT new, there's really very little that can be done to prevent misunderstanding no matter where you first land. On top of that, it's extremely unlikely that someone very new to the internet would start their journey at LessWrong.
0Martin-2
Mr. Jesus H. Christ is a bad example. Also there's this.