Spend ~5 minutes on the r/exmormon or r/exmuslim subreddits, and you'll find most people leaving those religions consider them cults. Spend ~5 hours in a Mormon church or Muslim mosque, and most people not born into those religions will consider them cults. My guess is:
If your argument is, "from an inside perspective, Rationalism doesn't seem like a cult," then you're proving way too much. From an inside perspective, every Mormon believes they are not part of a cult:
I think you could argue that Mormonism fits maybe one or two of these cult warning signs, but I'm pretty sure every group hits one or two. Heck, your company probably hits more than Mormonism.
I believe that there are four main strategies groups use to respond to accusations of being a cult: defense, offense, assimilation, and acceptance. This article is representative of the typical rationalist strategy of mounting an intellectual defense. Religious and ethnic groups tend to assimilate or play offense. Nonthreatening but unusual hobbyist, spiritual, or community groups often jokingly pretend to accept, or at least entertain, the accusation.
It seems to me that a defensive strategy is exactly what genuinely threatening groups often use to justify their existence. They may claim that technically, the group doesn't fit the characteristics of a cult, or that technically some other group that is an accepted part of society does fit the characteristics of a cult, and therefore that society, by its own standards, must accept the group. But this is false! Society doesn't have to use this standard, and often doesn't. It comes across as trying to undermine the often unspoken norms of the society, making them legible so the group can manipulate them in order to achieve its ends. That feels threatening. Multi-level marketing schemes try to frame themselves as technically legitimate businesses. Right-wing extremists try to frame themselves as legitimate well-organized militias. White nationalists and neo-Nazis try to frame themselves as ordinary ethnic and cultural groups. And rationalists try to frame themselves as ordinary community associations. This pattern-matching, I claim, is part of why rationalists are so vulnerable to cult accusations.
Compounding this is what rationalists do take offense at. Rationalists don't take offense at being called cults. They did take offense at having the real name of one of their community leaders revealed. That is very much the sort of thing a secret society or cult might do. If rationalists took loud offense at being insulted by "cult" accusations while also making moves to show up in wider society as well-assimilated, upstanding members of it, then over time, that would tamp down on the cult accusations. But rationalists very much don't seem to be assimilating. They incorporate seemingly every weird strain of Silicon Valley culture and seem to do almost nothing the "normal way," from dress, to food, to housing, to relationships, to art, to conversation, to work, to politics.
So: no assimilation, no taking offense at being called a cult, no lighthearted acceptance of the accusation, just mounting intellectual defenses arguing that the group fails to meet some self-chosen technical criteria of what it means to be a "cult." Sounds like the kind of thing a cult might do, in order to attract more of the kind of people who buy into this way of thinking and isolate them from the rest of society through an indoctrination into its weird culture!
Having some affinity for rationalism, I think I understand why rationalists do this. They've had a history of, inconsistently, rejecting what they call the Dark Arts and "asymmetric weapons," and trying to stick with a facts-and-logic approach to every debate. Focusing on a strategy based on intellectual defense is in line with that. It suits the self-chosen morality of rationalists. But sticking to that so consistently is, again, a refusal to assimilate. So I think that as long as rationalists persist in this strategy, so long will they be perceived as a cult.
This sounds like damned if you do, damned if you don't. If someone accuses you of being a cult, saying that you are not is exactly what a cult would do. But it's not like saying nothing will clear the accusation either.
Actually, saying that you are not a cult is not what an actual cult would do. An actual cult would attack the accuser (like physically, not like "tweet a disagreement"), and redirect the debate to "freedom of religion".
Is anyone defending the rationality community by some kind of argument that pattern-matches "we are the good/holy guys, therefore criticizing us for anything (whether truthfully or not) is inherently evil"? I am not aware of anything like this. Are people taking personal revenge against David Gerard or Cade Metz (other than complaining online about their behavior)? Again, I am unaware of that.
Less importantly, I don't believe that "accepting the cult accusation lightheartedly" would work. I am not even sure it ever worked for anyone. Do you have some good examples?
I articulate a fake framework for thinking about strategies by which groups can fight for status as a legitimate culture-in-good-standing. This fight is a long-term struggle, not a guaranteed victory or defeat. Groups are always damned by many, but can aim to be damned by fewer over the long run by being organized and strategic about how they respond to criticism.
Core to this strategy is taking offense at being called a cult while assimilating into the broader culture in ways that do not sacrifice your core values but are crucial issues for others.
For example, to assimilate, Mormons dropped polygamy, repudiated in the strongest terms the violent behavior of people who claim to be Mormons, and now frame themselves as Christians, members of the Church of Jesus Christ of Latter-day Saints (or LDS), rather than using the more loaded “Mormon.” In fact, the Mormon in my PhD program just refers to “his church” without even mentioning LDS. They also take offense at public media for misrepresenting their faith or their current stance on the problems in the religion’s past, which they acknowledge and condemn. They don’t try to explain why they’re different from a cult according to a list of abstract criteria, the way OP is doing here.
Being lighthearted is not a suitable strategy for a group that is being subjected to serious, sustained cult accusations. It’s a strategy for small, odd, harmless groups to acknowledge their difference from the broader culture, creating space for a person who feels a momentary question or discomfort to bring it up, and then move on. It’s what to do for small new-age meditative communities, the Odd Fellows, etc.
If rationalists wanted to adopt a Mormon-like strategy, they’d need to define what the core values of a rationalist are. They’d need a formal leadership that can interact with media and pre-emptively sanction subgroups and individuals who give rationalism a bad name. They’d need to make visible demonstrations of assimilation. And they’d need to demand respect and take offense at disrespect, insisting on being viewed as a positive cultural and community association. That’s very far from rationalist trends, and I don’t expect it will happen. But I do think it’s the strategy that every long-lived religion and major cultural group lands on.
Core to this strategy is taking offense at being called a cult while assimilating into the broader culture in ways that do not sacrifice your core values but are crucial issues for others.
I like this summary!
Just a quick idea, I think a nice way to contact the outside culture without compromising on anything could be to organize public lectures or workshops. Kinda like the existing workshops for the mathematically gifted kids or wannabe bloggers, only they should be short (like, one afternoon) and instead of inviting people to our "walled compound" they should be on a neutral territory that feels safe (maybe even offer to organize the workshop in a local school). Possible topics: mathematics, statistics, computer science, learning in general, critical thinking, blogging. Too bad I am not in America, I would quite enjoy doing something like that, and I am not doing anything important that this would distract me from. This could help create public perception of our community as "harmless nerds".
Ironically, one thing that might help would be to somehow make the membership explicit. Without explicit membership, you cannot exclude people (such as Zizians or SBF), so people can argue that they belong to us, and there is no way to prove the opposite. Mormons can say "this is not one of us" when someone is not, and they can kick someone problematic out when they are. Or maybe "rationality" is too nebulous a word, so we could instead talk about e.g. "Less Wrong community membership". It's like the orthogonality thesis: whether someone is good at Bayesian updating, and whether someone is a decent person, are two independent things -- we should try to find the people in the intersection.
I wonder (but this would be a longer debate) if we could have some kind of "web of trust", where individual rationalists could specify how much they trust someone to be a nice and reasonable person; the system would calculate the score in some way, and you would need to exceed some threshold to be accepted. If you turn out to be a scammer or a serial killer, everyone who vouched for you would be punished in some way (lose their right to vouch for someone, get a penalty on their own score). No idea how specifically the math should work here.
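One minimal way the math could work (purely a sketch; the function names, the damping constant, and the threshold are my own inventions, not an existing system): treat vouches as a weighted graph, propagate trust outward from a set of fully trusted seed members with per-hop damping, and discount a person's other vouches if someone they vouched for turns out to be bad.

```python
# Sketch of a vouch-based web of trust (all names and parameters hypothetical).
# Each member assigns trust weights in [0, 1] to people they vouch for.
# A newcomer's score is the trust flowing to them from established members,
# propagated a few hops with damping so distant vouches count less.

DAMPING = 0.5      # each hop halves the trust passed along (arbitrary choice)
THRESHOLD = 0.25   # minimum score to count as "accepted" (arbitrary choice)

def trust_scores(vouches, seeds, hops=3):
    """vouches: {voucher: {vouchee: weight}}; seeds: fully trusted members."""
    scores = {person: 1.0 for person in seeds}
    for _ in range(hops):
        new = dict(scores)
        for voucher, edges in vouches.items():
            base = scores.get(voucher, 0.0)
            for vouchee, weight in edges.items():
                passed = DAMPING * base * weight
                # keep the strongest trust path found so far
                new[vouchee] = max(new.get(vouchee, 0.0), passed)
        scores = new
    return scores

def punish_vouchers(vouches, bad_actor, penalty=0.5):
    """If someone turns out to be a scammer, retract vouches for them
    and weaken the future vouching power of everyone who vouched."""
    for voucher, edges in vouches.items():
        if bad_actor in edges:
            edges[bad_actor] = 0.0          # retract the vouch
            for vouchee in edges:           # penalize their other vouches
                edges[vouchee] *= penalty

# With seeds {"A"} and A->B (0.8), B->C (0.9): B scores 0.5*1.0*0.8 = 0.4
# (above threshold), while C scores 0.5*0.4*0.9 = 0.18 (below threshold).
```

The damping factor is doing the "friend of a friend is less trusted" work; without it, a long chain of enthusiastic vouchers could launder full trust to anyone.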
These are good ideas. I like the idea of offering tutoring or classes as a way to engage a broader community. I also think having formal orgs that interface with media and have official leaders who speak on behalf of their membership seems like a good idea. However, to work, I think these orgs are going to have to officially put the brakes on some of the divergent lifestyle choices of membership and on some of the more radical statements by rationalist figures, and it may not be compatible with the culture of rationalists to submit to constraining, assimilative norms in that way.
The web of trust is also something I’ve wanted for the world of science. The way I picture it is that you need a way to subscribe to other people or organizations whose judgments you trust. Each participant can privately rate their trust level in other participants. The trust level they observe reflects the aggregate trust levels of the participants they subscribe to. Would love to see such a technology.
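The subscription version described above reduces to a tiny aggregation function. As a sketch (the data shapes and the function name are hypothetical): each participant privately rates others, each viewer subscribes to the raters whose judgment they trust, and the trust a viewer observes in a target is the mean of the subscribed raters' scores, so different viewers can see different scores for the same person.

```python
def observed_trust(subscriptions, ratings, viewer, target):
    """Viewer's observed trust in target = mean of the private ratings
    given to target by the raters the viewer subscribes to.
    subscriptions: {viewer: [raters]}; ratings: {rater: {target: score}}.
    Returns None when no subscribed rater has rated the target."""
    raters = subscriptions.get(viewer, [])
    scores = [ratings[r][target] for r in raters
              if r in ratings and target in ratings[r]]
    return sum(scores) / len(scores) if scores else None
```

This is deliberately first-order; a fuller version would recurse through subscriptions (as in the propagation sketch of a web of trust), but even the flat form captures the key property that trust is personalized rather than global.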
They may claim that technically, the group doesn't fit the characteristics of a cult, or that technically some other group that is an accepted part of society does fit the characteristics of a cult, and therefore that society, by its own standards, must accept the group. But this is false! Society doesn't have to use this standard, and often doesn't. It comes across as trying to undermine the often unspoken norms of the society, making them legible so the group can manipulate them in order to achieve its ends. That feels threatening.
I think this is both an important point and one that proves too much. I'm interested in a discussion of the harms, cult-like or otherwise, that rationalism might be producing, but I can't get invested in the opinion of a guy who thinks a three-story building is inherently ominous.
That’s an entirely valid interest. One point I want to make is that when people spin a narrative accusing a group of being a cult, the points of evidence they raise are often post-hoc rationalizations rather than true reports of how they concluded the group was a cult. In this case, I doubt the guy truly finds the building ominous, any more than I find pictures of the art in Jeffrey Epstein’s mansion ominous. They take on an ominous feeling because of prior assumptions about the group that are being mapped onto things associated with it.
This matters because it’s important to accurately distinguish the real evidence someone’s drawing on to form their conclusion that a group is a cult from the narrative they’re spinning to make an accusation or insult. In this case, the quip about the building is the insult, not the evidence. It’s still fine to not take the insult seriously. But usually we evaluate people’s evidence and reasoning process to decide whether to take them seriously. In this case, we’d want to know how the guy truly first came to the conclusion that rationalism was a cult. Maybe he heard a bunch of stories about the Zizians and was exposed to some salacious Metz quotes of Scott’s writing and Tyler Cowen’s quotes of Eliezer. Maybe he heard from his friend they’re a cult and saw some weirdly dressed people going to a party and heard about Aella doing sex work to pay for IVF. Or maybe he applied the cult criteria differently than OP here and decided the rationalists fit the bill. Some reasons may be good, others bad, but in any case, this is where I’d look to decide whether to take him seriously. If I just wanted to defend rationalism I’d say “this guy is a bigot, a rationalist-phobe, and a hypocrite. He should be ashamed of himself. Would he talk this way about a religious community or an ethnic group or a hobby community? No? Then what makes it OK to talk about us this way? It’s unacceptable.” Etc.
This feels like a bit of a straw man. Rationalism isn’t a cult, but it spawns cults, and they tend to make the evening news. I think the cause and effect is a bit backwards. My pet opinion is that rationalism is a particularly effective tool for escaping cults, and so rationalist spaces are full of ex-cult members who, despite their newfound anti-cult tools, tend to relapse. But the rationalism-cult correlation is there.
my usual comment to people who say rationalism is a cult is to say, nah it's a secular "religion"[1] that has spawned cults. several of them. it's a problem. but comparing on the merits of leaveability and relationship to people who stopped chilling with rat folks, and lack of central command structure, vigorous and constant pushback against anyone who looks leader-ish, it's less severe than some groups I've encountered. definitely not zero worrying features. definitely concerning how obsessed rationalists are with proving it's neither a cult nor a religion. maybe if the ones who are bothered about that could take a sober look at how it's spawned cults they'd be a bit less insufferable about it.
a point I most often make to people, rationalists or randos online, who are giggling about starting a cult: nah cult abuses are always bad and they're easy to start doing. be aware of what they are and put effort into undoing them; the things you think you like about cults, spirituality and religiousness and friend groups or whatever, fine do those, humans like those, but don't do the things that make something a cult rather than a new age religion. make real damn sure you don't replace your ability to "touch grass" and see reality and shit like that, don't start following a leader without exception handlers for if that leader tells you to do dangerous stuff. if they try to isolate you or convince you to do dangerous things without thinking them through, or with thinking them through but they are the authority on what's true about a large part of the thing, that's a really bad sign and you should get some space with people who don't agree with the authority, and are socially far away, to at least see what they think. of course sometimes they'll be in a conflicting memeplex, that's the point, but it also means don't take their word as law either. you have to actually, you know, do your own thing.
sort of, still less totalizing than the religion I grew up with in terms of behavior prescriptions
(Agreed)
A good definition of religion is Harari's "an idea that supports social harmony". It explains why religions often become monotheistic/totalizing: new ideas that promote new kinds of social order may threaten an existing social order. It also explains why a religion doesn't need to become totalizing: many organising principles are orthogonal or symbiotic. For instance, bayesianism partly resolves the long-perceived tension between rationalism-classic (the principle that people should sometimes act on novel arguments) and empiricism (the principle that people should go out and check things before acting) by explaining how to appraise complicated arguments about the balance of evidence. It also (via FDT) supports a synthesis of consequentialist individualism and kantianism, explaining that it can be egoistically rational to be the kind of person who would conform with moral policies even in situations where the immediate consequences don't benefit them. To some extent it even rescues healthy forms of woo, by telling us when it's okay (or not) to follow the advice of heuristics or approximations.
Although I'd say bayratism is only a 7 on the religiosity scale, due to its consequentialist origins underpreparing it to build the kind of accountability mechanisms that could ever make a religious community actually support coordination or being more moral. It loves its illegible contrarians too much. I'm not sure what to do about that.
With all due respect, this post reads as, “The lady doth protest too much.” It looks like you’ve dedicated this post to absolutely, comprehensively dismantling any notion that your community has anything to do with cults — with a religious fervour, almost. :)
But do you think there’s any merit to the argument that your community has some cult-like aspects to it?
I suspect that it would have been easier for me to take your post in good faith, if you’d been, like, “Yeah. We definitely have some problems and norms that are worrisome. We should probably examine those things. I can see how someone would make that connection, even if I don’t think that we’re a full-blown cult.”
It’s not like Cade is the only person who’s ever critiqued the rationalists on this subject. I’ve spent some time in Berkeley and I’ve certainly thought about how some of the stuff is cult-y; and I’ve talked to other people about it, who agree.
(I made my account to leave this comment, but I’m engaging in good faith; I’m not a troll.)
I agree that we should discuss the specific problems.
I would even approve of writing some kind of resource on "bad things that happened in the rationalist community". It should be mandatory reading for new members.
On the other hand, the fact that your comment is upvoted here... is not something you would expect from a typical cult, is it?
On the other hand, the fact that your comment is upvoted here... is not something you would expect from a typical cult, is it?
maybe not as often, but could still be "Ooh, someone who doesn't like the cult! better show them we're Normal!", not to mention "hey, look, we're not a cult, see? we upvoted you!" Traditional hardcore cults would definitely have difficulty doing it, but medium-strength cults like the religion I grew up with explicitly tell evangelists to listen to people's doubts, while never holding any doubts oneself. there are still ways for permitting doubt to be turned into a tool of control.
There are usually no simple and reliable ways to tell that something isn't bad; the available rules are either simple or reliable, not both. Usually people err on the side of simple and classify some things as bad that aren't necessarily, but which are hard to confirm as being not bad, so as to not risk getting screwed over by mistakes in complex is-it-bad-or-not rules. Eg, "don't talk to strangers on the street unsupervised" is common (generally good!) advice for kids, and it excludes talking to the majority of actually pretty nice people so as to not get screwed over by the less-but-still-plenty-common dangerous ones.
It should be mandatory reading for new members.
I don't think having explicit membership that makes a mandatory reading moment possible is itself a good idea.
I do think it's good for a community to have a warnings doc, so that people who encounter the community can inform themselves about what kind of danger or annoyance the community is likely to produce, making it familiar and avoidable. Though if your list of warnings is incomplete, it can produce a false sense of security in readers. If it's complete and accurate, it can become a target of attacks by the people it's trying to contain, which they sometimes succeed at, at least partially. And if it contains mistakes, people can even have legitimate issues with it. Eg, malicious folk love to try to get into a position where they can add people to a warnings list like that, so it can even become a tool of control itself.
It still seems like a good idea; that's why I comment in places like this to point out that things are not fine. But it requires having a good sense of the landscape of attacks.
If you’re standing outside a walled compound, it can be hard to tell whether the people inside are part of a wholesome community or a cult.
It's probably also worth mentioning that Lighthaven is a conference venue, not a place people typically live long-term. If we all lived there it would definitely be a step or two up on the cultiness scale.
(I realize I'm preaching to the choir by posting this here. But I figure it's good to post it regardless.)
Recently, Scott Alexander gave a list of tight-knit communities with strong values:
- The Amish: They live apart in tight-knit communities with strong countercultural values, and carefully control their technological and ideological environment. 10/10.
- Cults and communes: Any cult mature enough to have its own compound, or any communal living project, has succeeded almost as thoroughly as the Amish. We may not support their insane religious beliefs, or the various sex crimes they are no doubt committing, but they have succeeded at Fukuyama’s suggestion of knitting themselves a new god within the liberal order. 9.5/10.
- Ultra-Orthodox Jews and Mormons: Get lots of people of the same religion together in one place - a timeless classic. Some of the ultra-est of the ultra-Orthodox are still more fluent in Yiddish than English, giving them near-invincibility from the mainstream. 9/10.
- The Free State Project: some libertarians made a deal that if enough other libertarians agreed, they would all move to New Hampshire and try to turn it into a libertarian paradise. They got about 20,000 people on board; the results ranged from building entirely new libertarian towns in the forest, to buying homes in Portsmouth or Manchester and keeping in touch with their libertarian friends. 7/10.
- Serious Christianity: Lots of Christians have social circles centered around their church, send their children to Christian schools, have Christian therapists they can visit if they feel down, and consume Christian media. On the other hand, they usually work a secular job, and most of their neighbors are secular. 6/10.
- The LGBTQ community: don’t laugh at this one. If you know many of these people, you know they have their own parallel society of LGBT friends, LGBT bars, and LGBT dating sites. They attend LGBT parties, conform to LGBT fashions, and watch LGBT sports (like roller derby). They live in special LGBT-friendly neighborhoods, and everyone around them follows LGBT-friendly norms. They even have their own flag, an obvious first step for people trying to form a country-within-a-country. 5/10.
- The rationalists: I live on a street with five other rationalist families and a small rationalist microschool. The broader Bay Area rationalist community has its own parties, dating sites, media, holidays, a conference center, and even a choir. 5/10.
I would probably add the Jewish community writ large to this list. Most Jews, while not as strict or communitarian as the Ultra-Orthodox, have their own traditions, holidays, newspapers, and socialize in synagogues or other Jewish groups. In fact, most ethnic communities in America fall into this category. Chinese Americans, to give just one example, also have their own holidays, social groups, community centers, Chinatowns, and Chinese-language newspapers. Oftentimes religious organizations in America draw most of their membership from a particular racial / ethnic community (e.g. black churches, Korean churches, or Indian Hindu temples).
Also, political groups might fall under this category. People on the political left might choose to only associate with other leftists, only live in blue areas or blue states, and only get their news from left-leaning political commentators. Leftists have their own value system and their own leaders, and these are distinct from the values and leaders that are respected by the rest of society. The same applies for the political right.
I bring this up to make the point that there’s no clear line between cults and normal, wholesome community groups. When, exactly, does an insular community go from quirky to culty? When, exactly, does a church go from normal religious group to cult? Few people would argue that we need to eliminate the Chinese, Jewish, Mormon, or LGBTQ communities on account of their groupish natures. We, as a society, generally think it’s fine to have close-knit communities with strong values, even strong countercultural values, so long as those communities are not actively harmful to the rest of society.
Unfortunately, this standard is not universally applied, and people often use the label “cult” as a way to insult their political opponents. Take, for instance, this recent article written by Cade Metz of the New York Times about the Rationalist community.
For context: Metz has had a years-long adversarial relationship with the Rationalist community ever since he doxxed and wrote a hit piece about the above-quoted Scott Alexander, who is a famous and well-respected Rationalist writer. In this recent article, Metz describes the Rationalists and their activities, while not-so-subtly implying that the Rationalists have a secretive and nefarious agenda. Metz also favorably quotes a chaplain named Greg Epstein, who argues that the Rationalist / Effective Altruist (EA) community is a cult.
Needless to say, I disagree with this accusation. Yes, we’re a close-knit community with strange habits, but so are the Mormons, the Muslims, and the Marxists. Unless you’re willing to say that every Mormon, Muslim, or Marxist is part of a cult, then it’s unfair to say that about the Rationalists. If anything, I’d say there’s a stronger argument for Mormons, Muslims, and Marxists being cultists than Rationalists.
To be fair, that’s not immediately obvious to an outsider. If you’re standing outside a walled compound, it can be hard to tell whether the people inside are part of a wholesome community or a cult. So in the spirit of transparency and responding to accusations in good faith, I’ll give my reasons for believing that the Rationalist community is not a cult. In order to do so, I’ll use this list of 10 warning signs of cults. (By the way, I’m not cherry-picking this list in order to support my argument. This list was literally the first result on Google when I searched “warning signs of a cult”.)
As I’m going through this list, I want you to beware isolated demands for rigor. It’s easy to point to some bad incident in the Rationalist community and claim that it proves the entire group is evil or cultish. But in a community with thousands of people that has lasted for around 20 years, it is inevitable that some bad incidents will occur. The relevant question to ask is not, “Can I find something wrong with the Rationalist community?” Rather, the relevant question is, “Is the Rationalist community worse than other communities I care about?” Whatever scrutiny you wish to apply to the Rationalist community, you should seriously ask yourself if your own communities can survive that same level of scrutiny.
The simplest way to refute the claim that the Rationalist community is authoritarian is to ask, “Where’s the authority?” There is no Rationalist leadership team that members are forced to pledge allegiance to, and there is no central Rationalist organization with rules that members are forced to follow.
Yes, there are leaders in the Rationalist community, in the sense that there are individuals who are popular and well-respected in that community. But the same can be said of any community. Yes, there are norms you have to follow in certain Rationalist spaces (e.g. the norm that you don’t lie to people or act in bad faith). But once again, that can be said of literally any community. Rationalist leaders face the same, if not greater level of accountability as the leaders of other major movements and organizations.
Definitely not true! Rationalists probably ask way more questions than the general population, and we both give and receive more criticism than the general population. In fact, one of the defining features of the Rationalist movement is our penchant for heated debates and our commitment to near-absolute free speech.
If anything, we are unusually prone to criticizing ourselves. The Effective Altruism Forum has an entire section highlighting criticism of Effective Altruism. They even put on a “Criticism and Red Teaming Contest”, where they solicited people to argue against the movement and awarded money to the best critics. Can you imagine any other advocacy organization doing this? Can you imagine, for instance, a reproductive rights organization hosting an “Abortion Criticism Contest” and giving money to the most persuasive pro-life advocates?
Once again, this is an area where the EA / Rationalist community is actually unusually good compared to other communities. One of the original ideas of Effective Altruism is that charities need to be more transparent about their finances and their results. And we try as hard as possible to live up to that standard! GiveWell — one of the largest EA charities — is probably the gold standard of transparency in charities. All of their official records, dating back to their founding in 2007, are easily accessible from their website, and they even have a full page detailing mistakes they’ve made over the years.
EA / Rationalist organizations are subject to the same laws as any other organizations, and their finances can be reviewed via the same legal processes. In many cases, as mentioned above, Rationalist organizations are more transparent than they are legally required to be.
And before you mention Sam Bankman-Fried, I want to remind you that he committed crimes as a private individual, not in collaboration with the broader EA / Rationalist community. There is no evidence that EA leaders knew about his crimes before the general public did. So while some EA / Rationalist organizations were funded using ill-gotten FTX money, this was not known to any of the recipients at the time. SBF’s crimes were immediately and forcefully denounced by EA leaders as soon as they were uncovered, and Lightcone Infrastructure (the organization that runs the Lighthaven campus in Berkeley) even returned significant portions of the money that they had been given by SBF.
Any movement can contain a criminal. Indeed, once a movement gets large enough, it’s almost inevitable that at least one of its members will be a criminal. The important question to ask is, “How well does the community respond to a criminal’s crimes, once they are discovered?” And in the case of SBF, I think the Rationalist / EA community responded about as well as they could have.
This is just not something that happens in the Rationalist community. We don’t believe we’re being persecuted, and we don’t believe there’s some evil conspiracy against us. We are sometimes wary of journalists writing hit pieces about us. But this is a reasonable concern, since journalists have a habit of writing hit pieces about us! We believe that we are sometimes unfairly attacked, but we respond to those attacks without resorting to conspiracy theories or developing a persecution complex. As you can hopefully see by this article, I respond to criticism in a reasonable way without resorting to personal attacks against my critics.
This is the exact opposite of what the Rationalist community is like. There is an entire sub-community of people calling themselves “Post-Rationalists”, people who were once Rationalists but who now believe the Rationalist project is somehow misguided. The Rationalists do not shun these people or try to censor their ideas. To the contrary: Rationalists and Post-Rationalists typically get along great, and Post-Rationalists frequently retain positive relationships with Rationalists even after leaving the movement.
Similarly, there are sub-communities of self-styled “Rat-adjacents” or “EA-adjacents” — people who share some of the beliefs of the Rationalist / EA communities, but who choose not to get fully on board. Do we shun these people or try to purity-test them? No!
Whether a person is post-Rationalist, Rationalist-adjacent, or even just non-Rationalist, we accept them. There are legitimate reasons to leave the movement, and we don’t blame anybody for doing so.1
Can I claim that nobody in the Rationalist community has ever been mistreated or abused? No. In any large community, something bad is bound to happen to somebody. There have, unfortunately, been incidents of sexual harassment within the Rationalist community. These incidents should be, and are, forcefully condemned by the community.
Once again, I urge you to apply the same standard to the Rationalist community that you would apply to any other community. Does abuse / mistreatment happen more often in the Rationalist / EA community than in other communities? Is there a pattern of abuse, rather than isolated incidents? Are there cover-ups of this abuse? Are victims shamed or ostracized by the community when they speak out? I can confidently answer “No” to each of these questions.
Again, not something that happens in the Rationalist community. As stated above, we do not believe we have been “abused” by our critics, nor do we spend much time ruminating on this “abuse”.
The only time I can recall something even close to this happening is in the aftermath of the aforementioned Scott Alexander doxxing controversy, when there was a good deal of writing about how unfairly Scott had been treated. But I think this was a reasonable reaction to the situation. If your real name gets published on the pages of the New York Times against your wishes, you have the right to be angry about that!
This is the one point out of ten that does apply to the Rationalist / EA communities. I know several Rationalists who feel guilty for not being rational enough. And I know several EAs who feel guilty for not being effective or altruistic enough. Hell, I sometimes feel guilty about this.
There is an implicit demand inherent in Effective Altruism, which is: You must be the most effective version of yourself that you possibly can be, and you must use this power to help as many sentient beings as possible. To be clear, EA leaders do not explicitly endorse that demand, but many EAs nonetheless internalize it. And like any strict moral code, it is impossible to live by fully. We are mere mortals, and we are not perfect. Even the best of us will sometimes make mistakes, believe falsehoods, or fail to live up to our potential. I will never be good enough, but I will never stop trying.
So fine. Score one for “Rationalism is a cult”.
Definitely not true! As mentioned above, there is no one Rationalist leader to pledge allegiance to, and even if there were, we wouldn’t do it. Probably the biggest name in Rationalism is Eliezer Yudkowsky, one of the originators of the movement as well as one of its most prolific writers. We do not believe Yudkowsky is right at all times. I believe Yudkowsky is wrong about many things. In fact, I have yet to meet a Rationalist who doesn’t have some bone to pick with him.
The same can be said of other prominent Rationalists, like Scott Alexander or Robin Hanson. We certainly respect these people, and we often defer to them on areas where we think they’re more knowledgeable than us. But we do not hold it as an article of faith that you have to agree with these people. People can, and frequently do, disagree with leading Rationalists while remaining Rationalists in good standing. If anything, one of the fastest ways to gain status within the Rationalist community is to publish an effective critique of a popular Rationalist writer, since it proves that you’re capable of “playing ball” with the best.
See Point 9.
As you can hopefully see by now, the Rationalist community does not exhibit the disturbing patterns that cults typically exhibit. We do not try to isolate our members from outside influences, and we do not demand allegiance to any ideology or leader. There is no way to credibly accuse the Rationalist community of being a cult without also impugning other well-regarded communities as cults.
The reason I’m writing this article is not only that I want to defend my community against unfair accusations, but that the standard set by this New York Times article, if applied more broadly, would damage society as a whole. It’s good for people to form tight-knit communities with countercultural values. Some of the greatest friendships and romantic relationships come from communities with strong, abnormal beliefs and habits. By contrast, forcing people to conform to a mainstream society that they dislike is bad for them, since it will cause them to become alienated and resentful. And it would be bad for society as a whole, since alienated, resentful individuals are more likely to lead dysfunctional lives and join extremist political movements.
Speaking for myself, I became much happier after joining the Rationalist / EA community, and I have formed some of my greatest friendships and professional connections through that community. Rationalism definitely isn’t for everyone, but I think that everyone should have some version of it — some version of a tight-knit community with strong values. (See here for advice on how to find / build your own.) As a society, we should allow and even encourage the formation of these communities, rather than attacking the ones that exist by calling them cults.
There are real cults in the world, and they should be criticized. But demonizing innocent communities as cults helps nobody, and only worsens our society’s already-terrible loneliness epidemic. So no, I’m not part of a cult. If you want to accuse Rationalists of being cultists, you’ll have to find a better argument than, “Look at all the weird things they do!”