Yes, Rationalism is a Cult

by James Camacho
7th Aug 2025

23 comments, sorted by top scoring
[-]Brendan Long2mo*1813

I dunno, under this definition isn't pretty much every group a cult? A group that gives outsiders as much influence as insiders isn't really a group at all, and a lot of groups pick their most impactful arguments even if they're somewhat sketchy if you look too closely[1]. Earlier this week I was at a culty Safeway that didn't have any space for competitors to advertise, and which falsely claimed to "Always have the best deals*".

  1. ^

    I'm not saying we should do this, and pushing back on arguments you think are bad is a good thing, but the drowning child thought experiment doesn't seem particularly culty.

Reply
[-]James Camacho2mo20

Yes, pretty much every (large) group has cult adaptations. Otherwise it would have been outcompeted by a similar group with a similar ideology that hides its inconsistencies a little better. I think there are two important differences between Rationalism and Safeway that make the former culty in a worse way:

  1. The degree of impact it has on people's lives. People will switch careers, or donate percentages of their wealth for Rationalism, but maybe waste $100 on a Safeway membership card.

  2. You get a lot more pushback from a Rationalist than a Safeway employee when you point out inconsistencies. If you go around an EA conference saying, "I don't think it's worthwhile to work on AI safety research, because most of the research groups I've seen haven't put out useful work," every attendee will find a reason you should work on it anyway, and probably switch fields to try to join one of those research groups. If you go around a Safeway store telling the employees, "I don't think you have the best deals, Whole Foods seems to have cheaper eggs," they'll probably shrug their shoulders and say, "I guess so."

Reply
[-]Larks2mo53

If you go around an EA conference saying, "I don't think it's worthwhile to work on AI safety research, because most of the research groups I've seen haven't put out useful work," every attendee will find a reason you should work on it anyway, and probably switch fields to try to join one of those research groups.

This sounds false to me. Have you tried it? 

Reply
[-]Seth Herd2mo1710

If your definition of cult is any set of beliefs that anyone has ever promoted in bad/dishonest/non-rational ways, then every organization and set of beliefs with more than about two adherents is a cult.

Reply
[-]James Camacho2mo-3-10

That is consistent with my view. I find some much more culty than others, and some culty in much worse ways than others, but yeah, every cause is a cult and everyone should switch to egoism.

Reply
[-]Archimedes2mo1512

Then saying rationality is a cult is mostly just a game of semantics that says almost nothing about rationality in particular.

Reply
[-]James Camacho2mo10

Sure, that's why it's useful to put numbers on how culty you think various institutions are. To throw out some completely random numbers, maybe Mormonism is a 9/10 while Rationalism is a 5/10. I'm not sure of the right way to scale the numbers, but it fits a model where Mormonism is much more culty than Rationalism, which is much more culty than most of the scientific fields (the bottom of which is maths at 0/10).

Reply
[-]Richard_Kennaway2mo*20

Sure, that's why it's ~~useful~~ misleading to put numbers on how culty you think various institutions are.

FTFY. Otherwise you just get "You scored it 1/10! So it is a cult! You said so right there!" You've framed it that way by "counting down", adding up points of cultishness along multiple dimensions. Everything will look like a cult by that standard. Nothing is allowed to count as a positive sign of not-a-cult; not-a-cult is not even a concept in this ontology. All groups are guilty until proven innocent. No group can be proven innocent, they just haven't been found guilty yet. Just being a group is already worth a few points of cultishness, points that can never be removed.

Reply
[-]M. Y. Zuo2mo10

Considering all groups to at least have an incipient potential of cult formation seems sensible?

It might not literally be a point of “cultishness” on a scale out of 10, but on a scale out of 100 that seems more sensible. It is true, after all, that the risk can never be reduced to perfectly zero as long as the group exists.

I can’t think of any exceptions either…

Reply
[-]Richard_Kennaway2mo30

Considering all groups to at least have an incipient potential of cult formation seems sensible?

About as sensible as considering all people to at least have an incipient potential of, oh, insert anything you like here.

"at least have an incipient potential of" means nothing.

Reply
[-]AnthonyC2mo31

Then "is a cult" is not a useful way of describing the question, under your view. You'll want a quantitative (or at least ordinal) metric of cultishness, since almost nothing will be at either 0 or 100 percent. Otherwise this reduces to another "What exactly is a sandwich?" type discussion and we're just trying to decide where along a continuum it makes sense to start or stop using the word in practice.

You know, like the 1-10 scoring used in ASX's recent post on tight-knit communities.

Reply
[-]Karl Krueger2mo72

An anti-epistemology that goes around teaching people the concept "anti-epistemology" is a very weird sort of anti-epistemology, don't you think?

Reply
[-]frontier642mo67

I feel like a post that was seriously trying to make the case that rationalism is a cult would be significantly longer and more thought out. This post provides a random definition of a set, claims that set = cult, and then makes the true assertion that rationalism is in that set.

Reply
[-]Richard_Kennaway2mo40

As a constructivist, it seems they are promoting obviously bad beliefs, like an axiom of infinity, which leads to massive confusion around things like fractals, self-reference, and the universe at large.

Spoken like a true member of the constructivist cult!

Reply
[-]TAG2mo*41

You are missing a major issue by characterizing an anti-epistemology as a set of abstract principles. What characterises cults is social epistemology...particularly "believe what the leader says" and "believe what everyone else believes". And rationalism certainly has a list of beliefs that are widely held, yet unproven, and it is a close-knit group of people who socialise with each other, so the social epistemology mechanism is in play. Rationalism is nonetheless not close to being a fully fledged cult...

  1. Absolute authoritarianism without accountability
  2. Zero tolerance for criticism or questions
  3. Lack of meaningful financial disclosure regarding budget
  4. Unreasonable fears about the outside world that often involve evil conspiracies and persecutions
  5. A belief that former followers are always wrong for leaving and there is never a legitimate reason for anyone else to leave
  6. Abuse of members
  7. Records, books, articles, or programs documenting the abuses of the leader or group
  8. Followers feeling they are never able to be “good enough”
  9. A belief that the leader is right at all times
  10. A belief that the leader is the exclusive means of knowing “truth” or giving validation.

...It doesn't score 10/10 on any of these criteria, although it does score above zero on some (notably number 2). And, really, it's not good enough for an organisation dedicated to truth seeking to be averagely cultish, because any amount of cultishness is inimical to truth seeking.

@Karl Krueger

Yes, rationalism is potentially self-correcting. Errors can usually be explained by appealing to rationalist principles.

Reply
[-]SpectrumDT1mo10

However, a lot of people who don't get caught with Singer's thought experiment, and don't recognize the inconsistency with their previously held beliefs.

This sentence is not completely grammatical. It looks like the result of an editing mistake. I would love to know exactly what you intended to say here.

Smarter and more honest inductees will report that, "I care about having nice things for myself, and then my friends and family, much more than a random child," leave the room muttering, "cult," and not show up to the next EA reading club.

Could you please elaborate on why you think "smarter and more honest" people will give this answer?

Reply
[-]James Camacho1mo10
  1. The ellipsis is, "genuinely prefer others' well-being over a marginal increase in their own," from the previous sentence.

  2. They have to be smarter to recognize their actual beliefs and investigate what is consistent with them. They have to be more honest, because there is social pressure to think things like, "oh of course I care about others," and hide how much or little they care.

Reply
[-]SpectrumDT1mo10
  1. I still do not understand it. Could I ask you to please rephrase the sentence "However, a lot of people..." so it expresses what you intended to say, as exactly as reasonably possible?
  2. Are you saying that no one genuinely agrees with Peter Singer's conclusion (that you should sacrifice your own convenience to save a stranger)? And that everyone who claims to agree with Singer either (1) lies or (2) is too stupid to know their own beliefs?
Reply
[-]James Camacho1mo10

There are also people that genuinely prefer others' well-being over a marginal increase in theirs—mostly wealthy or ascetic folks—and I think this is the target audience of EA evangelism. However, a lot of people don't genuinely prefer others' well-being over a marginal increase in their own (or at least, the margin is pretty small), but these people still end up caught with Singer's thought experiment, not realizing that the conclusions it leads them to (e.g. that they should donate to GiveWell) are inconsistent with their more fundamental values.

Reply
[-]SpectrumDT1mo10
  1. Are you taking into account the distinction between the preferences that people act on and the preferences that people wish they would act on? I don't know if there is any standard terminology for this (in philosophy or on LessWrong), but an obvious example is the smoker who struggles to stop smoking. It is possible to "prefer" one's own convenience and luxury, and at the same time want to be a "better person" by doing more for others.
  2. I get the impression that you think the majority of rationalists actually prefer their own convenience over effective altruism even though they may be deluded or dishonest about it. What evidence do you base this on?
Reply
[-]James Camacho1mo-30

If you "want to stop smoking" or "want to donate more" but do not, you are either deluding yourself, lacking intelligence, or preferring ignorance. Deluding yourself can make you feel happier about yourself. "I'm the kind of person who wants to help out other people! Just not the kind who actually does [but let's not think about that]." Arguably, this is what you really prefer: to be happy, whether or not your thoughts are conistent with your behavior. If you are smart enough, and really want to get to the bottom of any inconsistencies you find yourself exhibiting, you will, and will no longer be inconsistent. You'll either bite the bullet and say you actually do prefer the lung cancer over the shakes, or actually quit smoking.

Are the majority of rationalists deluded or dishonest? Absolutely. As I said in my post, utilitarianism is not well-defined, but most rationalists prefer running with the delusion.

Reply
[-]SpectrumDT1mo30

Your posts in this thread ooze with contempt for large swathes of people, on and off LessWrong. In this last post you are not doing much analysis; you are mostly just judging.

I get the impression that your reasoning here is motivated less by a desire to genuinely understand the human mind and more by a desire to dismiss people who disagree with you and feel superior to them.

Please note that this is NOT intended as an attack on you. It is intended as constructive criticism. I am suggesting that you could benefit from being more curious and more aware of your own biases.

Reply
[-]Austin Long2mo10

LessWrong's voting system might bury content which would otherwise make rationalists aware of inconsistencies, but it may also bury content which would otherwise convince rationalists to disregard flagged inconsistencies. I suspect that the voting system does more good than bad for group epistemics, but I think evidence is necessary to defend strong claims for either position.

Every group of people will have some features in common with the prototypical cult. I don't think it's useful to refer to rationalism as a cult because I doubt that it has enough cultish features. For example: there is no authoritarian leader, restrictions are not imposed on rationalists' contact with family and friends, etc.

Reply

(I realize many people here are part of this cult, and it may be upsetting to hear that people consider it a cult. But I figure "No, Rationalism is Not a Cult" needs a response.)

My definition of 'cult' is close to

An ideology that employs anti-epistemology to convince you to support it.

The hard part of this definition comes with the word "anti-epistemology". If a university math department teaches the ZFC set theory axioms, and asks their professors to use these axioms for formal proof verification rather than Type Theory (or another ideology), are they promoting anti-epistemology? As a constructivist, it seems they are promoting obviously bad beliefs, like an axiom of infinity, which leads to massive confusion around things like fractals, self-reference, and the universe at large[1].
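
For concreteness (my own gloss, not part of the original post), the axiom the constructivist objection targets can be written out in LaTeX:

$$\exists I\,\bigl(\varnothing \in I \;\wedge\; \forall x\,(x \in I \rightarrow x \cup \{x\} \in I)\bigr)$$

That is, there exists a completed set containing the empty set and closed under successor. A constructivist can accept each finite stage while rejecting the completed infinite totality, which is roughly where the disagreement starts.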

The difficulty with declaring it anti-epistemic is that you cannot prove a system consistent from within the system. The best you can do is find an inconsistency. As far as we know, the ZFC ideology is internally consistent, so even if another ideology claims it promotes obviously bad beliefs, all we learn is that the two ideologies are inconsistent with each other.
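
(Added for precision, not from the post:) the relevant result here is Gödel's second incompleteness theorem. For any consistent, recursively axiomatized theory $T$ strong enough to encode basic arithmetic,

$$T \nvdash \mathrm{Con}(T),$$

so ZFC cannot certify its own consistency; the most anyone can do from inside is fail to find a contradiction.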

Most people find it difficult to hold inconsistent beliefs once they are aware of them, so most people would not be a ZFC set theorist and a constructivist at the same time. This is good, because it means ideologies that claim to have real-world consequences are being constantly tested when their members bump up against the real world. If a Young Earth Creationist learns about radioisotope dating, they may transition to a more modern version of Christianity. Of course, some of them just bite the bullet, but enough will reject the Young Earth that the ideology will not grow as fast as others (or even shrink), and soon be dominated by ideologies that mesh better with the real world[2].

As ideologies get bigger, it becomes more likely someone will discover an inconsistent belief, so the biggest ideologies have adapted to

  1. become more internally consistent, and
  2. make it less likely an inconsistency is brought to their believers' awareness.

This second adaptation is what constitutes "anti-epistemology", and makes an ideology a cult. If mathematicians discovered a contradiction in the ZFC axioms, it wouldn't become "metaphorical" and a "still useful belief, even if not literally true". It would get replaced, just like the set theory of the 1800s. Physicists are slightly more culty, and it has been said that the field advances one funeral at a time (Planck's principle), but they are usually pretty good about spreading awareness when an inconsistency is found. They are not hiding the inconsistency between general relativity and quantum mechanics from their own members, let alone the public.
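
As a one-line reminder of how that 1800s replacement went (my gloss): naive comprehension lets you form the set $R = \{x : x \notin x\}$, and then

$$R \in R \iff R \notin R,$$

a flat contradiction. The response was not to call naive set theory "metaphorical" but to replace unrestricted comprehension with the ZFC axioms.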

Mormonism is staunchly on the other side of the cultiness scale. It has about as many true believing members as the physicists, but retains them mostly by keeping them unaware of inconsistencies. Do their scriptures say that polygamy is an everlasting covenant that everyone must follow to get to the top-tier heaven? Why yes, but their teachings gloss over this, maybe with a phrase about being temporary or necessary for the times. The members who notice a little confusion are met with apologetics that somehow interpret the clause as its own negation. But of course, very few people get to that point, as they're told that the internet is a very bad, anti-Mormon source of knowledge, and that the proper channels for resolving questions are prayer, the scriptures (just not that passage), and of course, church leaders. Most of the adaptations Mormonism makes—in how it teaches and what it teaches—are to make its members less likely to come across an inconsistency, and if they do, not realize it is an inconsistency.

Rationalism does this too, just not to the same degree. Here are a couple of examples of adaptations the ideology has made that have the effect of hiding inconsistencies:

  1. LessWrong has a voting system where more established users get more votes. This is good, because you can build a reputation system, but also bad because part of that reputation is how closely you align with the ideology. Some ideas, like selfish egoism, just disappear in the downvotes, while others, like effective accelerationism, are mostly addressed in apologetics. Most LessWrongers are not even aware that sum-utilitarianism is not well-defined (one illustration follows this list), and while I could steelman AI slowdown, I hardly know the arguments for e/acc.
  2. Effective altruism is always introduced by drowning a child. Smarter and more honest inductees will report that, "I care about having nice things for myself, and then my friends and family, much more than a random child," leave the room muttering, "cult," and not show up to the next EA reading club. Less critical people will bite the bullet and say, "I guess I shouldn't care about distance," and less analytical people will mostly be emotionally distraught about the drowning children everywhere. There are also people that genuinely prefer others' well-being over a marginal increase in theirs—mostly wealthy or ascetic folks—and I think this is the target audience of EA evangelism. However, a lot of people who don't get caught with Singer's thought experiment, and don't recognize the inconsistency with their previously held beliefs.
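
To unpack the "not well-defined" parenthetical in item 1 above (my own illustration, not an argument from the post): once the population being summed over is unbounded, the total can fail to converge, and when it converges only conditionally its value depends on the order of summation. For example, with per-person utilities

$$u_n = \frac{(-1)^{n+1}}{n}, \qquad n = 1, 2, 3, \ldots$$

the series $\sum_n u_n$ can be rearranged to converge to any real number whatsoever (Riemann's rearrangement theorem), so "maximize the sum of utilities" does not pick out a unique verdict without extra, somewhat arbitrary, conventions.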

While these adaptations have the effect of hiding some inconsistencies, this seems to be mostly a side-effect of otherwise useful adaptations. The LessWrong voting system is superior to others, even if it creates ideological cementation. Singer's thought experiment is an easy introduction for effective altruism, even if it will convince some people to donate money in an inconsistent manner. Rationalism is a cult, but not intentionally[3].

Edit: As a couple of comments pointed out, lots of groups employ anti-epistemology to get you to support them. Why should I care more when Rationalism does this than Safeway? It's due to the degree of impact. If Safeway doesn't actually have the best deals, my friends or I may lose $20. If earn-to-give and 80k hours aren't actually consistent with my or my friends' preferences, we may lose 20% of our wealth. Or, ideologically, Safeway's claims are relegated to a small, peripheral thought you may have the next time you go grocery shopping, while Rationalism's will dominate your decision-making process.


  1. Spacetime is a smooth manifold? How did you smooth it out? ↩︎
  2. From a solipsistic perspective, it's a little tricky to define the "real world", but let's just go with the distribution of logically possible universes that your thoughts could arise from, weighted by the Kolmogorov complexity to reach your thoughts (e.g. the standard model + evolution takes fewer bits to describe a path to your thoughts than a Boltzmann brain). ↩︎
  3. The stimuli that created the culty adaptations weren't, "we're bleeding members, what holes do we need to plug?" but tamer things like, "how do we increase the quality of posts people see?" or "how do we introduce this unintuitive idea to people who aren't philosophy majors?" It makes sense to assign intent based on the problem being solved, so Rationalism is not intentionally culty, while Mormonism certainly is. ↩︎