There's an implicit assumption I sometimes encounter among rationalists that I find quite frustrating, and that makes it unnecessarily hard for the rationality community to absorb the useful bits of non-rationalist memeplexes. It goes something like this: "If it doesn't sound like anything mentioned in the Sequences, it is irrational."
Enfants terribles of the rationalist community like Michael Smith, or the so-called postrationalists, get this, but tend not to explain it in terms that interface well with the rationalist memeplex. So here's a framework that might be useful for grokking the merits of non-rationalist memeplexes:
If you think of your brain as the hardware you run on, and of the memeplexes and habits you apply day-to-day as the software, you can consider the language games you use to be the operating system: still software, but more foundational than the specific programs you execute for tackling particular tasks. And just as with operating systems, no single language game is the most useful one for every real-world purpose.
Consider this quote by Alan Watts:
There are basically two kinds of philosophy. One’s called prickles, the other’s called goo. And prickly people are precise, rigorous, logical. They like everything chopped up and clear. Goo people like it vague. For example, in physics, prickly people believe that the ultimate constituents of matter are particles. Goo people believe it’s waves. And in philosophy, prickly people are logical positivists, and goo people are idealists. And they’re always arguing with each other, but what they don’t realize is neither one can take his position without the other person. Because you wouldn’t know you advocated prickles unless there was someone advocating goo. You wouldn’t know what a prickle was unless you knew what a goo was. Because life isn’t either prickles or goo, it’s either gooey prickles or prickly goo.
Talking like Yudkowsky (or like what you imagine Yudkowsky talks like) is remarkably useful for developing epistemic rationality and discussing prickly stuff. However, for some practical purposes I've found other, gooier language games more useful than the common lingo of rationalists. A fine in-group example of the proper use of gooey language is Duncan Sabien's proposal to use the Magic: The Gathering color wheel as a personality type model. I find talk of energy (in human bodies and in rooms) excellent for describing the phenomenology of emotions and nonverbal interactions, and for learning some of the more complex skills in that domain. The lingo of fairy tales and religion serves me well for talking about human relationships, values, and my sense of purpose. See, for example, the litanies and mantras of LessWrong; HPMOR's "Mysterious Old Wizard" trope; "young witches and wizards, heroes and heroines" (I sometimes encounter these and try to help them along their way); and lineages of transmission (which point to the tacit knowledge of any subject that is hard to pick up other than by watching more experienced folk in action). And when a new DeepMind paper is published, I even tend to get a bit biblical: "My God, my God, why hast thou forsaken me?"
Of course, gooey models are so appealing to the human mind that they tend to become a gateway for false beliefs. That's why it is useful to try to reconstruct gooey memeplexes in pricklier terms, as Kaj Sotala did in his Multiagent Models of Mind sequence. But the fact that nobody has yet come up with a sufficiently prickly, Sequences-y wording for something is not strong evidence that it is useless, evil, or dangerous.
Talking like the Sequences is like loading a 200-pound barbell onto your shoulders: it is outstanding training for your leg and core muscles, but if you try to run a marathon like that, you had better have attained demigodhood (see what I did there?), or you will collapse. So don't be a straw Vulcan, a rationality cultist, an Apple snob: become operating-system-savvy, and do the instrumentally rational thing by using each OS for exactly what it's best at.
So, what is a gooey language game you like to play that costs you status points in the rationalist in-group?