I have recently been thinking about this question: "What is it exactly that installs religious software so deeply and dogmatically into the brain?" Those who are strongly religious often fall into a few categories: (1) they were trained to believe in specific aspects of religion as children; (2) they entered religion during a very desperate part of their lives (e.g. severe depression, a midlife crisis, the loss of a job, a death in the family, cancer, alcoholism, or some other existential crisis).

What strikes me about these situations is that emotion dominates the decision-making process. I remember attending church camp as a child at the encouragement of my family. The camp counselors pressured me heavily to "accept Christ," and I saw a clear positive correlation between my willingness to accept Christ, memorize Bible verses, and make certain statements about behavior in the context of Christian morals, and the way the camp counselors, my extended family, and other adults treated me. As a result, it was not until many years later that my preference for rationalism and science was able to fully crack the emotionally founded religious belief installed in me as a child. I know many people for whom a similar narrative holds regarding experiences with alcohol and the like, though it seems rare for someone to completely dismiss deeply and emotionally held beliefs from their youth.

Emotion is something we have evolved to use. Generally speaking, we need emotion because life sometimes demands split-second decisions, and we don't have the opportunity to integrate our decision process over all the data. If someone attacks me, I will become angry, because anger raises my adrenaline levels, temporarily suppresses other biological needs like hunger or waste removal, and enables me to fight for survival. Essentially, emotion is a recorded previous decision that operates on stereotypical data; in probabilistic terms, it is like basing a quick decision solely on the first moment of a batch of previously experienced data. The first moment might not be the best descriptor of the data, but if you're in a computational bind you might not be able to do much better, and you'll be biologically penalized for spending your CPU time computing better descriptors. It is undeniable, though, that the decisions we make based on emotion produce some of the most powerful and deepest-seated beliefs we have.
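To make the probabilistic analogy concrete, here is a toy sketch in Python. The payoff numbers are invented, and the two rules are only meant to illustrate the idea: deciding from the first moment alone versus paying extra computation for a fuller description of the data.

    import statistics

    # Toy illustration: deciding from the first moment (mean) of past
    # experience versus also computing a fuller description of the data.
    # All payoff values are invented for illustration.

    past_outcomes = [0.9, 0.8, -1.5, 0.95, 0.7]  # payoffs of past snap decisions

    def emotional_decision(data):
        # Split-second rule: act if the average past payoff was positive.
        return statistics.mean(data) > 0  # first moment only: cheap to compute

    def deliberative_decision(data):
        # Slower rule: also weigh the spread of outcomes before acting.
        mu = statistics.mean(data)
        sigma = statistics.stdev(data)  # second moment: costs more CPU time
        return mu - sigma > 0           # act only if the payoff is reliably positive

    print(emotional_decision(past_outcomes))     # True:  the mean says "act"
    print(deliberative_decision(past_outcomes))  # False: the spread says "hesitate"

The first rule is the one you can afford under time pressure, even though the second is a better summary of the same data.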

With religion this is especially true. Very religious people, in my view, have this software installed emotionally and then spend years practicing the art of pushing it ever closer to the act of perception itself, until sensory data is almost literally passed through a religious filter before it is even processed and presented for perception. A sunset becomes a symbol of God's love, so much so that there is (almost) no distinction between the literal viewing of the photons depicting the sunset and the thinking of the thought "This shows that God loves me." Emotionally installed software presents a very difficult problem: depending on how close to the act of perception it has been pushed, there is a remarkably tiny window of opportunity for presenting data that could convincingly demonstrate that rational alternatives are better in a number of important senses.
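One way to see how tiny that window is: treat the installed belief as a prior probability pushed near certainty and run an ordinary Bayesian update against it. This is only a sketch, and the probabilities below are arbitrary numbers chosen to show the shape of the problem, not measurements of anything.

    # Toy Bayesian update: a belief "installed" as a near-certain prior
    # barely moves even under strongly contrary evidence.
    # All probabilities are arbitrary illustrations.

    def update(prior, likelihood_ratio):
        # Odds form of Bayes' rule: posterior odds = prior odds * LR.
        prior_odds = prior / (1 - prior)
        post_odds = prior_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    # Suppose the observed evidence is 20x more likely if the belief is false.
    contrary_lr = 1 / 20

    print(round(update(0.70, contrary_lr), 3))    # 0.104: a casual belief collapses
    print(round(update(0.9999, contrary_lr), 3))  # 0.998: an installed belief barely budges

If the filter sits upstream of perception, the effective prior is that extreme, and almost no single presentation of data can move it.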

I'm sure many of you have had debates where you've run into circular logic and unavoidable walls that stifle all useful discussion. Can we as a community come up with a good theory of how sensory data can help uninstall deeply and emotionally installed software in someone's brain? I really feel this is an area that deserves some philosophical attention. Is it the case that software installed in conjunction with emotion (by which I literally mean that the cyclic AMP cycles and other biological processes used for memory formation are made stronger, and that synaptic connections related to the library of belief concepts (e.g. religious ones) are reinforced by chemicals released with the emotive force of the experience in which they form) can only be uninstalled by a similarly impactful emotional experience? It appears that slow-moving rationality and logical discussion are almost physically powerless as convincing mechanisms. And if so, what should rationalists do to promote their ideas (aside from the obvious social pressure to stop installing religious software in the minds of children, etc.)?
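To make the question concrete, here is a crude toy model, entirely my own invention rather than neuroscience: treat belief strength as a weight whose updates are scaled by the emotional arousal present when the evidence arrives.

    # Crude toy model: belief strength updates in proportion to the
    # emotional arousal accompanying the evidence. All constants invented.

    def reinforce(weight, evidence, arousal):
        # Move the belief weight toward the evidence, gated by arousal.
        return weight + arousal * (evidence - weight)

    belief = 0.0
    belief = reinforce(belief, evidence=1.0, arousal=0.9)  # one emotional installation
    print(round(belief, 2))  # 0.9

    for _ in range(20):  # twenty calm, rational counterarguments
        belief = reinforce(belief, evidence=0.0, arousal=0.02)
    print(round(belief, 2))  # 0.6: still mostly intact

On this model, low-arousal argument is not strictly powerless, just agonizingly slow, which matches the intuition above.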

Note that in the discussion above I use 'religion' as a specific example, but any irrationally held belief that derives from an emotionally impactful experience would serve the same purpose. Here we can take 'religious' to refer to ontological claims that are unsupported by any evidence yet purported to have day-to-day impacts on life and decision-making. I would be very grateful for any thoughts the community has; hopefully we can generate some useful techniques for understanding how to appropriately uninstall emotional software (in the instances when it's useful to do so)... even the kinds that we rationalists ourselves often fall victim to in our own imperfect understanding of the world.

14 comments

The paper cited in Swimmer963's Action and Habit post, about how things learned under stress tend to be performed habitually, seems to explain well enough how the installation happens, if you assume that the emotion discussed in the OP is close enough to the stress discussed in the paper. (Be a little careful there, since Swimmer963 and I seem to agree that he misread the paper the first time around.)

I would be very grateful for any thoughts the community has; hopefully we can generate some useful techniques for understanding how to appropriately uninstall emotional software (in the instances when it's useful to do so)... even the kinds that we rationalists ourselves often fall victim to in our own imperfect understanding of the world.

Hassan's Combatting Cult Mind Control describes the best way out I'm aware of. Support the mind-control victim emotionally (to minimize activating the stress-learned habit) and talk to them about firsthand experiences of escaping from some other cult; they can often use that to help find their own way out. The full generalization of this seems to be Ericksonian hypnosis and communication-by-metaphor, which are much discussed in NLP circles.

I have little experience with actually doing this. I talked my wife out of her Christianity early in our relationship, but she wasn't very committed to it to start with and I didn't follow any particular process, so that should barely count.

Can we as a community come up with a good theory of how sensory data can help uninstall deeply and emotionally installed software in someone's brain?

To start with, see:

http://en.wikipedia.org/wiki/Deprogramming and http://en.wikipedia.org/wiki/Exit_counseling


From what I can gather, wouldn't a rationalist utilize exit counseling almost exclusively? (As in, only extreme, pathological situations would ever seem to merit the use of force involved in deprogramming.) Another issue I find is that the article on exit counseling specifically categorizes the belief system from which a person is to be removed as "a group perceived to be a cult." My guess is that for mainstream, widely held beliefs, transhumanists or even just "militant atheists" would be cast as the cult in that scenario. It's hard to approach the task of convincing a loving grandmother who has attended church for decades, yet fritters away needed retirement income on church-related donations, that some components of her belief system, however emotionally important, are detrimental.

I realize that we have a slider bar covering a spectrum of departure from what is acceptable to society at large. At the far, fringe end of that slider bar, it may become necessary to do these extreme things. But my interest lies more with people who by and large exhibit perfectly sane views about the world, who in most ways choose their behaviors as if supernatural phenomena had no impact, but who may make some decisions (such as what legislation to support) because of a deeply held emotional belief that, say, abortion is unequivocally wrong.

Or someone who defends creationism in a part of the world where there is a lot of sympathy toward creationism. If these are emotionally held beliefs, then radiometric dating evidence, fossil evidence, and cosmological evidence will be intercepted by a religious filter long before they reach the higher cognitive processes of the brain.

I definitely agree that the linked tactics can be helpful in extreme cases where a belief is held so strongly that it borders on mental illness (in the eyes of society, not just a subset of rationalists). And I agree with the other comment that for humanity at large the best path forward is to teach children about rationalism from a young age so that they don't fear it and don't grow up believing that morality, ethics, and life-fulfillment are intrinsically attached to faith and belief.

But more fundamentally, is the consensus just that if Bob or Alice believes in wacky-religious-concept-X, and they are civil about it but unresponsive to verbal arguments about evidence, then we should just abandon them to their beliefs?

What is the best projection of Johnny Appleseed into the rationalist world?

I've never, even among fundamentalists of various stripes, heard extreme atheism described as a cult, nor seen atheists subjected to treatment appropriate to cult members. I've heard it characterized as a religion, but that always seems to be more an attempt at leveling (i.e. "well, you have unjustified beliefs too") than a serious try at thinking of atheism in terms of a cultus.

(A lot of the Christian fundamentalist community's more extreme approaches to homosexuality do have a lot in common with deprogramming methods, incidentally -- so this doesn't just reflect a general distrust of the concept.)

Same goes for broader senses of transhumanism. I have heard singularitarianism described as a cult, but I'm not sure that necessarily indicates anything other than the absurdity heuristic throwing up positive results. The Overton window, or your slider bar, are pretty coarse metaphors; they don't necessarily imply specific approaches (i.e. cult deprogramming) to worldviews on their fringes.


I've never, even among fundamentalists of various stripes, heard extreme atheism described as a cult, nor seen atheists subjected to treatment appropriate to cult members.

I have heard this accusation, though I have not seen any actions from those espousing atheism/transhumanism that could reasonably fall under the category 'cult'. My only point was that in an environment where the majority holds at least some tenet of major religious belief, a person attempting deprogramming or exit counseling might appear to the majority as the one in need of help.

"Are you guys some sort of cult?" "Yes indeed, but we only accept donations in Bitcoins."

From what I can gather, wouldn't a rationalist utilize exit counseling almost exclusively? (As in, only extreme, pathological situations would ever seem to merit the use of force involved in deprogramming.)

Only if you accept a definition of deprogramming that includes the implication of coercion[*]. IMHO, deprogramming is - or should be - a general term with no implication of coercion.

Linking deprogramming to coercion seems likely to be a negative marketing move by exit counsellors (who pride themselves on not using coercion).

Aside from the FUD, "deprogramming" seems like a better, more general term than "exit counseling".

[*] Even then, if you want a rescue attempt to succeed, drastic measures may be necessary - cults can put up a fight.

Linking deprogramming to coercion seems likely to be a negative marketing move by exit counsellors (who pride themselves on not using coercion).

This is plausible, but is not how it happened. Historically, "deprogramming" became associated with coercion, so "exit counselling" was set up to do the same thing without coercion.

I appreciate your concerns that "deprogramming" shouldn't imply coercion, but in normative usage it does.

I appreciate your concerns that "deprogramming" shouldn't imply coercion, but in normative usage it does.

Not really. Here are some better definitions:

To counteract or try to counteract the effect of an indoctrination, especially a religious or cult indoctrination.

  1. to free (a convert) from the influence of a religious cult, political indoctrination, etc., by intensive persuasion or reeducation.

  2. to retrain, as for the purpose of eliminating or replacing a learned or acquired behavior pattern or habit that is undesirable or unsuitable.

Some people are trying to screw up this perfectly good word. I suspect that is probably for marketing reasons, or maybe it was just a misunderstanding. Whatever the reasons, they have nothing to do with good terminology. The proposed meaning, in which coercion is necessarily involved, totally sucks. I recommend not promoting such ugly nonsense.


Only if you accept a definition of deprogramming that includes the implication of coercion. IMHO, deprogramming is - or should be - a general term with no implication of coercion.

I totally agree, and to avoid confusion over words, I was merely using the definition of deprogramming from the Wikipedia link you provided, where it says:

Deprogramming refers to actions that attempt to force a person to abandon allegiance to a religious, political, economic, or social group. Methods and practices may involve kidnapping and coercion. Similar actions, when done without force, are called "exit counseling".

It doesn't matter to me which words we use to refer to these concepts, as long as we distinguish between the types of exit counseling / deprogramming that involve coercion and the types that don't. Making that distinction with the two phrases "deprogramming" and "exit counseling" seems as good as any to me, although "coercive deprogramming" and "noncoercive deprogramming" is clearer to the layman and allows exit counseling to claim itself as a subset of noncoercive deprogramming.

I was merely using the definition of deprogramming from the Wikipedia link you provided, where it says:

Deprogramming refers to actions that attempt to force a person to abandon allegiance to a religious, political, economic, or social group. Methods and practices may involve kidnapping and coercion. Similar actions, when done without force, are called "exit counseling".

Yes, but this is very bad terminology because it conceals unnecessary technical gumph (coercion) behind a perfectly ordinary-looking word (deprogramming). Of course, it should be:

deprogramming
  - coercive deprogramming
  - non-coercive deprogramming (exit counseling)

Another issue I find is that the article on exit counseling specifically categorizes the belief system from which a person is to be removed as "a group perceived to be a cult."

True enough. It is still interesting to see high-powered approaches, even if the meme therapy you have planned is more basic.

Similarly, if you plan a diet, you might want to check out how these folks recommend doing it.

"what is it exactly that helps install religious software so deeply and dogmatically into the brain?" Often those who are strongly religious fall into a few categories: (1) They were trained to believe in specific aspects of religion as children; (2) They entered into a very destitute part of their lives (i.e. severe depression, midlife crisis, loss of a job, death in the family, cancer, alcoholism, or other existential problems).

Are there actually any studies on this? It would be awesome if there were.

As for uninstalling such software, it seems like one way to do it would be to teach them rationality without mentioning religion, so that they don't see rationality as a threat. Hopefully at that point they'll already have the skills to see that their religion is false, and it will just fall away. Eliezer talks about this in Raising the Sanity Waterline.

Apposite to this, a Reddit article I found recently:

Atheism vs Theism may seem like a battle of wits involving only science and debate. The real truth is far deeper and darker than this, and anyone who considers discussing atheism with a "person of faith" should consider this.

There is considerable emotional pain in realising there is no God. What can we do to ease this pain?

(The article expands on the notion that you can't reason your way out of something you didn't reason yourself into. At least, not generally.)