AAAAARRRGH! I am sick to death of this damned topic. It has been done to death.
I have become fully convinced that even bringing it up is actively harmful. It reminds me of a discussion on IRC, about how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it. It's because of the Death Spirals and the Cult Attractor sequence that people bring up the stupid "LW is a cult hur hur" meme, which would be great dramatic irony if you were reading a fictional version of the history of Less Wrong, since it's exactly what Eliezer was trying to combat by writing it. Does anyone else see this? Is anyone else bothered by:
Eliezer: Please, learn what turns good ideas into cults, and avoid it!
Barely-aware public: Huh, wah? Cults? Cults! Less Wrong is a cult!
&
Eliezer: Do not worship a hero! Do not trust!
Rationalwiki et al: LW is a personality cult around Eliezer because of so-and-so.
Really, am I the only one seeing the problem with this?
People thinking about this topic just seem to instantaneously fail basic sanity checks. I find it hard to believe that people even know what they're saying when they p...
LW doesn't do as much as I'd like to discourage people from falling into happy death spirals about LW-style rationality, like this. There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person, but he seems to be okay with that. That's the main reason why I feel LW is becoming more cultish.
How do you distinguish a happy death spiral from a happy life spiral? Wasting one's life on a wild goose chase from spending one's life on a noble cause?
"I take my beliefs seriously, you are falling into a happy death spiral, they are a cult."
I guess you meant to ask, "how do you distinguish ideas that lead to death spirals from ideas that lead to good things?" My answer is that you can't tell by looking only at the idea. Almost any idea can become a subject for a death spiral if you approach it the wrong way (the way Will_Newsome wants you to), or a nice research topic if you approach it right.
(the way Will_Newsome wants you to),
I've recanted; maybe I should say so somewhere. I think my post on the subject was sheer typical mind fallacy. People like Roko and XiXiDu are clearly damaged by the "take things seriously" meme, and what it means in my head is not what it means in the heads of various people who endorse the meme.
There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative. If I were Eliezer, I would run screaming in the other direction the moment I saw the first such person
You mean when he saw himself in the mirror? :)
Seriously, do you think sacrificing one's life to help build FAI is wrong (or not necessarily wrong but not an ethical imperative either), or is it just bad PR for LW/SI to be visibly associated with such people?
I think it's not an ethical imperative unless you're unusually altruistic.
Also I feel the whole FAI thing is a little questionable from a client relations point of view. Rationality education should be about helping people achieve their own goals. When we meet someone who is confused about their goals, or just young and impressionable, the right thing for us is not to take the opportunity to rewrite their goals while we're educating them.
I think rationality education should help people achieve the goals they're already trying to achieve, not the goals that the teacher wants them to achieve.
False dichotomy. Humans are not automatically strategic; we often act on urges, not goals, and even our explicitly conceptualized goals can be divorced from reality, perhaps more so than the urges. There are general-purpose skills that affect behavior (and explicit goals) by correcting errors in reasoning, without being aimed specifically at aligning students' explicit goals with those of their teachers.
Rationality is hard to measure. If LW doesn't make many people more successful in mundane pursuits but makes many people subscribe to the goal of FAI, that's reason to suspect that LW is not really teaching rationality, but rather something else.
(My opinions on this issue seem to become more radical as I write them down. I wonder where I will end up!)
I didn't say anything about "rationality". Whether the lessons help is a separate question from whether they're aimed at correcting errors of reasoning or at shifting one's goals in a specific direction. The posts I linked also respond to the objection about people "giving lip service to altruism" but doing little in practice.
I don't think this calculation works out, actually. If you're purely selfish (don't care about others at all), and the question is whether to devote your whole life to developing FAI, then it's not enough to believe that the risk is high (say, 10%). You also need to believe that you can make a large impact. Most people probably wouldn't agree to surrender all their welfare just to reduce the risk to themselves from 10% to 9.99%, and realistically their sacrifice won't have much more impact than that, because it's hard to influence the whole world.
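A minimal back-of-the-envelope sketch of that argument, with purely made-up numbers (none of these figures come from the thread itself), might look like this:

```python
# Hypothetical numbers for the purely selfish version of the calculation.
p_doom_baseline = 0.10       # assumed personal existential risk with no effort
p_doom_with_effort = 0.0999  # risk after devoting one's whole life to FAI work
value_of_survival = 1.0      # normalize surviving to 1 unit of value
welfare_sacrificed = 0.5     # lifetime welfare given up, on the same scale

expected_gain = (p_doom_baseline - p_doom_with_effort) * value_of_survival
print(expected_gain)                       # roughly 0.0001
print(expected_gain > welfare_sacrificed)  # False: for a selfish agent, it doesn't pay
```

With anything like these numbers, the sacrifice only pays off if you also put substantial weight on everyone else's survival, which is the altruism point above.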
If developing AGI were an unequivocally good thing, as Eliezer used to think, then I guess he'd be happily developing AGI instead of trying to raise the rationality waterline. I don't know what Luke would do if there were no existential risks, but I don't think his current administrative work is very exciting for him. Here's a list of people who want to save the world and are already changing their life accordingly. Also there have been many LW posts by people who want to choose careers that maximize the probability of saving the world. Judge the proportion of empty talk however you want, but I think there are quite a few fanatics.
I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.
I think at one point Eliezer said that, if not for AGI/FAI/singularity stuff, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks he realized that he had to change his life completely.
I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?
I sacrificed some very important relationships and the life that could have gone along with them so I could move to California, and the only reason I really care about humans in the first place is because of those relationships, so...
This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.
— Nick Tarleton's twist on T.S. Eliot
My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users. I agree LW rocks in general. I think we're mostly talking past each other; I don't see this discussion post as fitting into the genre of "serious LW criticism" as the other stuff you link to.
In other words, I'm talking about first impressions, not in-depth discussions.
I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme. That sounds pretty implausible to me. Keep in mind that no one who is fully familiar with LW is making this accusation (that I know of), but it does look like it might be a reaction that sometimes occurs in newcomers.
Let's keep in mind that LW being bad is a logically distinct proposition, and if it is bad, we want to know it (since we want to know what is true, right?).
And if we can make optimizations to LW culture to broaden participation from intelligent people, that's also something we want to do, right? Although, on reflection, I'm not sure I see an opportunity for improvement where this is concerned, except maybe on the wiki (but I do think ...
My post was mostly about how to optimize appearances, with some side speculation on how our current appearances might be filtering potential users.
Okay.
If we want to win, it might not be enough to have a book length document explaining why we're not a cult. We might have to play the first impressions game as well.
I said stop talking about it and implied that maybe it shouldn't have been talked about so openly in the first place, and here you are talking about it.
I'd be curious where you got the idea that writing the cult sequence was what touched off the "LW cult" meme.
Where else could it have come from? Eliezer's extensive discussion of cultish behavior gets automatically pattern-matched into helpless cries of "LW is not a cult!" (even though that isn't what he's saying and isn't what he's trying to say), and this gets interpreted as, "LW is a cult." Seriously, any time you put two words together like that, people assume they're actually related.
Elsewise, the only thing I can think of is our similar demographics and a horribly mistaken impression that we all agree on everything (I don't know where this comes from).
Criticism rocks, dude.
Okay. (I hope you didn't interpret anything I said as meaning otherwise.)
It's at least plausible that a lot of the people who can be good for SIAI would be put off more by professional marketing than by science fiction-flavored weirdness.
AAAAARRRGH! I am sick to death of this damned topic.
It looks a bit better if you consider the generalization in the intro to be mere padding around a post that is really about several specific changes that need to be made to the landing pages.
A rambling, cursing tirade against a polite discussion of things that might be wrong with the group (or perceptions of the group) doesn't improve my perception of the group. I have to say, I have a significant negative impression from Grognor's response here. In addition to the tone of his response, a few things that added to this negative impression were:
"how painstakingly and meticulously Eliezer idiot-proofed the sequences, and it didn't work because people still manage to be idiots about it"
Again, the name dropping of Our Glorious Leader Eliezer, long may He reign. (I'm joking here for emphasis.)
"LW is a cult hur hur"
People might not be thinking completely rationally, but this kind of characterization of people who have negative opinions of the group doesn't win you any friends.
"since it's exactly what Eliezer was trying to combat by writing it."
There's Eliezer again, highlighting his importance as the group's primary thought leader. This may be true, and probably is, but highlighting it all the time can lead people to think this is cultish.
The top autocompletes for "Less Wrong" are
These are my (logged-in) Google results for searching "Less Wrong_X" for each letter of the alphabet (some duplicates appear):
Google's autocomplete has a problem, which has produced controversy in other contexts: when people want to know whether X is trustworthy, the most informative search they can make is "X scam". Generally speaking, they'll find no results and that will be reassuring. Unfortunately, Google remembers those searches, and presents them later as suggestions - implying that there might be results behind the query. Once the "X scam" link starts showing up in the autocomplete, people who weren't really suspicious of X click on it, so it stays there.
Eliezer addressed this in part with his "Death Spiral" essay, but there are some features of LW/SI that are strongly correlated with cultishness, other than the ones that Eliezer mentioned, such as fanaticism and following the leader:
Sorry if this seems over-the-top. I support SI. These points have been mentioned, but has anyone suggested how to deal with them? Simply ignoring the problem does not seem to be the solution; nor does loudly denying the charges; nor changing one's approach just for appearances.
Perhaps consider adding to the list the high fraction of revenue that ultimately goes to paying staff wages.
Oh yes, and the fact that the leader wants to SAVE THE WORLD.
But they're not buying malaria nets, they're doing thought-work. Do you expect to see an invoice for TDT?
Quite apart from the standard complaint about how awful a metric that is.
Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
I only discovered LW about a week ago, and I got the "cult" impression strongly at first, but decided to stick around anyway because I am having fun talking to you guys, and am learning a lot. The cult impression faded once I carefully read articles and threads on here and realized that they really are rational, well argued concepts rather than blindly followed dogma. However, it takes time and effort to realize this, and I suspect that the initial appearance of a cult would turn many people off from putting out that time and effort.
For a newcomer expecting discussions about practical ways to overcome bias and think rationally, the focus on things like transhumanism and singularity development seems very weird; those appear to be pseudo-religious ideas with no obvious connection to rationality or daily life.
AI and transhumanism are very interesting, but are distinct concepts from rationality. I suggest moving singularity and AI specific articles to a different site, and removing the singularity institute and FHI links from the navigation bar.
There's also the pro...
Random nitpick: a substantial portion of LW disagrees with Eliezer on various issues. If you find yourself actually agreeing with everything he has ever said, then something is probably wrong.
Slightly less healthy for overall debate is that many people automatically attribute a toxic/weird meme to Eliezer whenever it is encountered on LW, even in instances where he has explicitly argued against it (such as utility maximization in the face of very small probabilities).
Upvoted for sounding a lot like the kinds of complaints I've heard people say about LW and SIAI.
There is a large barrier to entry here, and if we want to win more, we can't just blame people for not understanding the message. I've been discussing with a friend what is wrong with LW pedagogy (though he admits that it is certainly getting better). To paraphrase his three main arguments:
We often use nomenclature without necessary explanation for a general audience. Sure, we make generous use of hyperlinks, but without some effort to bridge the gap in the body of our text, we aren't exactly signalling openness or friendliness.
We have a tendency to preach to the converted. Or as the friend said:
It's that classic mistake of talking in a way where you're convincing or explaining something to yourself or the well-initiated instead of laying out the roadwork for foreigners.
He brought up an example of how material might be introduced to newly exposed folk.
If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.
The curse of knowledg...
If This American Life explained the financial crisis in an hour so that four million people improved on a written test on the subject, it's clear you can explain complicated material from near-scratch.
That's an inspiring goal, but it might be worth pointing out that the This American Life episode was extraordinary-- when I heard it, it seemed immediately obvious that this was the most impressively clear and efficient hour I'd heard in the course of a lot of years of listening to NPR.
I'm not saying it's so magical that it can't be equaled, I'm saying that it might be worth studying.
Here's what an outsider might see:
"doomsday beliefs" (something "bad" may happen eschatologically, and we must work to prevent this): check
a gospel (The Sequences): check
vigorous assertions of untestable claims (Everett interpretation): check
a charismatic leader extracting a living from his followers: check
is sometimes called a cult: check
This is enough to make up a lot of minds, regardless of any additional distinctions you may want to make, sadly.
I've been here for only a couple of months, and I didn't get any impression of cultishness. I saw only a circle of friends doing a thing together and being very enthusiastic about it.
What I also did see (and still do) is specific people just sometimes being slightly crazy, in a nice way. As in: Eliezer's treatment of MWI. Or a way-too-serious fear of weird acausal dangers that fall out of our current best decision theories.
Note: this impression is not because of craziness of the ideas, but because of taking them too seriously too early. However, the relevant posts always have sane critical comments, heavily upvoted.
I'm slightly more alarmed by posts like How would you stop Moore's Law?. I mean, seriously thinking of AI dangers is good. Seriously considering nuking Intel's fabs in order to stop the dangers is... not good.
Speaking for myself, I know of at least four people who know of Less Wrong/SI but are not enthusiasts, possibly due to atmosphere issues.
An acquaintance of mine attends Less Wrong meetups and describes most of his friends as being Less Wrongers, but doesn't read Less Wrong and privately holds reservations about the entire singularity thing, saying that we can't hope to say much about the future more than 10 years in advance. He told me that one of his coworkers is also skeptical of the singularity.
A math student/coder I met at an entrepreneurship event told me Less Wrong had good ideas but was "too pretentious".
I was interviewing for an internship once, and the interviewer and I realized we had a mutual acquaintance who was a Less Wronger and SI donor. He asked me if I was part of that entire group, and I said yes. His attitude was a bit derisive.
Defending oneself from the cult accusation just makes it worse. Did you write a long excuse for why you are not a cult? Well, that's exactly what a cult would do, isn't it?
To be accused is to be convicted, because the allegation is unfalsifiable.
Trying to explain something draws more attention to the topic, and people will notice only the keywords. The more complex the explanation you make, especially if it requires reading some of your articles, the worse it gets.
The best way to win is to avoid the topic.
Unfortunately, someone else can bring up this topic and be persistent enough to make it visible. (Did it really happen on a sufficient scale, or are we just creating it in our own imaginations?) Then, the best way is to make some short (not necessarily rational, but cached-thought convincing) answer and then avoid the topic. For example: "So, what exactly is that evil thing people on LW did? Downvote someone's forum post? Seriously, guys, you need to get some life."
And now, everybody stop worrying and get some life. ;-)
It could also help to make the site seem a bit less serious. For example, put more emphasis on instrumental rationality on the front page. People discu...
People discussing best diet habits don't seem like a doomsday cult, right?
I'm having trouble thinking up examples of cults, real or fictional, that don't take an interest in what their members eat and drink.
Some things that might be problematic:
We use the latest insights from cognitive science, social psychology, probability theory, and decision theory to improve our understanding of how the world works and what we can do to achieve our goals.
I don't think we actually do that. Insights, sure, but latest insights? Also, it's mostly cognitive science and social psychology. The insights from probability and decision theory are more in the style of the simple math of everything.
Want to know if your doctor's diagnosis is correct? It helps to understand Bayes' Theorem.
This might sound weird to someone who hasn't already read the classic example about doctors not being able to calculate conditional probabilities. Like we believe Bayes' theorem magically grants us medical knowledge or something.
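For anyone who hasn't seen it, the classic example goes roughly like this (the numbers below are illustrative, in the spirit of the usual mammography version, not taken from the about page):

```python
# Illustrative numbers only: a rare condition, a fairly accurate test.
p_disease = 0.01             # base rate in the tested population
p_pos_given_disease = 0.80   # test sensitivity
p_pos_given_healthy = 0.096  # false positive rate

p_positive = (p_pos_given_disease * p_disease
              + p_pos_given_healthy * (1 - p_disease))
p_disease_given_positive = p_pos_given_disease * p_disease / p_positive

print(round(p_disease_given_positive, 3))  # ~0.078, not 0.8
```

The point is the gap between the test's 80% accuracy and the roughly 8% posterior probability, not that Bayes' theorem grants anyone medical knowledge.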
[the link to rationality boot-camp]
I'm not a native speaker of English so I can't really tell, but I recall people complaining that the name 'boot-camp' is super creepy.
On the about page:
Introduce yourself to the community here.
That's not cultish-sounding, but it's unnecessarily imperative. The introduction thread is optional.
Disclaimer: My partner and I casually refer to LW meetups (which I attend and she does not) as "the cult".
That said, if someone asked me if LW (or SIAI) "was a cult", I think my ideal response might be something like this:
"No, it's not; at least not in the sense I think you mean. What's bad about cults is not that they're weird. It's that they motivate people to do bad things, like lock kids in chain-lockers, shun their friends and families, or kill themselves). The badness of being a cult is not being weird; it's doing harmful things — and, secondarily, in coming up with excuses for why the cult gets to do those harmful things. Less Wrong is weird, but not harmful, so I don't think it is a cult in the sense you mean — at least not at the moment.
"That said, we do recognize that "every cause wants to be a cult", that human group behavior does sometimes tend toward cultish, and that just because a group says 'Rationality' on the label does not mean it contains good thinking. Hoping that we're special and that the normal rules of human behavior don't apply to us, would be a really bad idea. It seems that staying self-critical, understanding how ...
What's bad about cults is not that they're weird. It's that they motivate people to do bad things...
People use "weird" as a heuristic for danger, and personally I don't blame them, because they have good Bayesian reasons for it. Breaking a social norm X is positively correlated with breaking a social norm Y, and the correlation is strong enough for most people to notice.
The right thing to do is to show enough social skill to avoid triggering the weirdness alarm. (Just as publishing in serious media is the right way to avoid the "pseudoscience" label.) You cannot expect outsiders to make an exception for LW, suspend their heuristics, and explore the website deeply; that would be asking them to privilege a hypothesis.
If something is "weird", we should try to make it less weird. No excuses.
Often by the time a cult starts doing harmful things, its members have made both real and emotional investments that turn out to be nothing but sunk costs. To avoid ever getting into such a situation, people come up with a lot of ways to attempt to identify cults based on nothing more than the non-harmful, best-foot-forward appearance that cults first try to project. If you see a group using "love bombing", for instance, the wise response is to be wary - not because making people feel love and self-esteem is inherently a bad thing, but because it's so easily and commonly twisted toward ulterior motives.
Did anyone reading this initially get the impression that Less Wrong was cultish when they first discovered it?
What do you mean, "initially"? I am still getting that impression! For example, just count the number of times Eliezer (who appears to only have a single name, like Prince or Jesus) is mentioned in the other comments on this post. And he's usually mentioned in the context of, "As Eliezer says...", as though the mere fact that it is Eliezer who says these things were enough.
The obvious counter-argument to the above is, "I like the things Eliezer says because they make sense, not because I worship him personally", but... well... that's what one would expect a cultist to say, no?
Less Wrongers also seem to have their own vocabulary ("taboo that term or risk becoming mind-killed, which would be un-Bayesian"). We spend a lot of time worrying about doomsday events that most people would consider science-fictional (at best). We also cultivate a vaguely menacing air of superiority, as we talk about uplifting the ignorant masses by spreading our doctrine of rationality. As far as warning signs go, we've got it covered...
Specialized terminology is really irritating to me personally, and off-putting to most new visitors I would think. If you talk to any Objectivists or other cliques with their own internal vocabulary, it can be very bothersome. It also creates a sense that the group is insulated from the rest of the world, which adds to the perception of cultishness.
Agreed. I realize that the words like "litany" and "conspiracy" are used semi-ironically, but a newcomer to the site might not.
I have several questions related to this:
If you visit any Less Wrong page for the first time in a cookies-free browsing mode, you'll see this message for new users:
Here are the worst violators I see on that about page:
And on the sequences page:
This seems obviously false to me.
These may not seem like cultish statements to you, but keep in mind that you are one of the ones who decided to stick around. The typical mind fallacy may be at work. Clearly there is some population that thinks Less Wrong seems cultish, as evidenced by Google's autocomplete, and these look like good candidates for things that make them think this.
We can fix this stuff easily, since they're both wiki pages, but I thought they were examples worth discussing.
In general, I think we could stand more community effort being put into improving our about page, which you can do now here. It's not that visible to veteran users, but it is very visible to newcomers. Note that it looks as though you'll have to click the little "Force reload from wiki" button on the about page itself for your changes to be published.