There's an implicit assumption I sometimes encounter among rationalists that I find quite frustrating, and that makes it unnecessarily hard for the rationality community to absorb the useful bits of non-rationalist memeplexes. It goes somewhat like this: "If it doesn't sound like anything mentioned in the Sequences, it is irrational."

Enfants terribles of the rationalist community like Michael Smith or the so-called postrationalists get this, but tend not to explain it in terms that interface well with the rationalist memeplex. So here's a framework that might be useful for grokking the merits of non-rationalist memeplexes:

If you think of your brain as the hardware you run on, and of the memeplexes and habits you apply day-to-day as the software, you may consider the language games you use to be the operating system: still software, but more foundational than the specific programs you execute for tackling particular tasks. And just as with operating systems, no single language game is the most useful one for every real-world purpose.

Consider this quote by Alan Watts:

There are basically two kinds of philosophy. One’s called prickles, the other’s called goo. And prickly people are precise, rigorous, logical. They like everything chopped up and clear. Goo people like it vague. For example, in physics, prickly people believe that the ultimate constituents of matter are particles. Goo people believe it’s waves. And in philosophy, prickly people are logical positivists, and goo people are idealists. And they’re always arguing with each other, but what they don’t realize is neither one can take his position without the other person. Because you wouldn’t know you advocated prickles unless there was someone advocating goo. You wouldn’t know what a prickle was unless you knew what a goo was. Because life isn’t either prickles or goo, it’s either gooey prickles or prickly goo.

Talking like a Yudkowsky (or the way you think a Yudkowsky would talk) is remarkably useful for developing epistemic rationality and talking about prickly stuff. However, I've found other, more gooey language games more useful for some practical purposes than the common lingo of rationalists. A notable in-group example of gooey language used well is Duncan Sabien's proposal to use the Magic: The Gathering color wheel as a personality type model. I find talking about energy (in human bodies and in rooms) absolutely remarkable for describing the phenomenology of emotions and nonverbal interactions, and for learning some of the more complex skills in that domain. The lingo of fairy tales and religion serves me well for talking about human relationships, values, and my sense of purpose. See for example the litanies & mantras of LessWrong, HPMOR's "Mysterious Old Wizard" trope, or "young witches and wizards, heroes and heroines" (I sometimes encounter these and try to help them along their way). There are lineages of transmission (which point to the tacit knowledge of any subject that is hard to pick up other than by watching more experienced folk in action). And at times when a new DeepMind paper is published, I even tend to get a bit biblical: "My God, my God, why hast thou forsaken me?"

Of course, gooey models are so appealing to the human mind that they tend to become a gateway for false beliefs. That's why it is useful to try to reconstruct gooey memeplexes in more prickly terms, like Kaj Sotala did in his Multiagent Models of Mind sequence. But I don't think the fact that nobody has yet come up with a sufficiently prickly, sequence-ey wording for something is strong evidence that it is useless, evil, or dangerous.

Talking like the Sequences is like loading a 200-pound barbell on your shoulders: it is outstanding training for your leg and core muscles, but if you try to run a marathon like that, you had better have attained demigodhood (see what I did there?), or you will collapse. So don't be a straw Vulcan, a rationality-cultist Apple snob: become operating-system-savvy, and do the instrumentally rational thing by using each OS for exactly what it's best at.

So, what is a gooey language game you like to play that costs you status points in the rationalist in-group?


14 comments

Consider this quote by Alan Watts:

There are basically two kinds of philosophy. One’s called prickles, the other’s called goo. And prickly people are precise, rigorous, logical. They like everything chopped up and clear. Goo people like it vague. For example, in physics, prickly people believe that the ultimate constituents of matter are particles. Goo people believe it’s waves.

*facepalm*

If one has a technical understanding of QFT[1] (or even half a technical understanding, like me), this sounds totally silly. There's no real question as to whether things are fundamentally "particles" or "waves". There's nothing "goo" about waves either; waves refer to mathematically precise phenomena. Physics is precise and mathematical ("prickly" if you like). Anyone who understands QFT must have a prickly side, and can deal with both waves and particles without confusion. Those who think physics can be "vague" don't understand the physics.

(Critiquing Watts of course, not primarily the author of this post)

  1. Quantum field theory

I mean… what you're actually criticizing is that Alan Watts is a goo philosopher.

He's not trying to be precise or carefully define what he's talking about. He's instead using loose metaphors to convey a feeling.

And your objection is that his loose metaphors are pointing at things that have precise definitions and therefore he's technically mistaken in how he's trying to illustrate his point.

To which a goo person would shrug. Because they understood the message, and that's the real point. Not technical accuracy of the words & metaphors.

So in a funny way, you're actually illustrating Watts' point.

I find talking about energy (in human bodies and in rooms) absolutely remarkable for describing the phenomenology of emotions and nonverbal interactions, and for learning some of the more complex skills in that domain. 

I agree. The word energy is useful for describing things that can't be measured in joules. When doing machine learning, nobody has a problem calling something temperature that very obviously can't be measured in kelvins.

I had one physiology lecture with a professor who was trying to teach us cybernetics. He had no problem calling one parameter energy even though it quite obviously wasn't measured in joules. He didn't seem bothered by it in the least, nor did he show any sense of breaking a taboo.

It turns out that if you want to talk about a new domain of knowledge, it's often very useful to recycle concepts like energy or temperature from another domain.

Er. Not to vanish too far up my own navel, but the place this post lost me was when it implicitly proposed a strict dichotomy between prickles and goo, which seems like an obvious misdirect/misstep/red herring.

I think there's a thing where, like, many people have failed to be able to make these two things compatible in their own minds (and it's true that when you look out at the world you see a lot of people that seem pretty tribally on one side or the other), so they think of them as two necessarily separate things.

But this post reinforcing what seems to me to be a false dichotomy was pretty =/.  Like, ending it with "let's focus on especially gooey stuff that especially irritates this prickly crowd" seems counterproductive.

Agree with this. 

I think there's an occasionally occurring thing where people will post "gooey stuff" on LW and it's rejected not because it's gooey but because it's overconfidently and/or sloppily argued. Then in some cases, rather than responding to the criticisms, the authors will attribute the negative reaction to the gooiness of the post rather than to the fact that people had reasonable criticisms the authors weren't willing to address. In turn, LW readers will see that the people making the gooey posts are unable to take criticism, and make the inference that all of the gooey stuff just melts people's brains and makes them unable to reason rationally... making them more likely to actually be prejudiced towards more gooey stuff in the future.

But this only happens because there's a perceived dichotomy between gooey and prickly stuff in the first place. If the people posting the gooey stuff wouldn't make a big deal out of how their gooey stuff is more sophisticated than the simple-minded prickly readers are capable of comprehending, the bias against gooey stuff wouldn't get formed in the first place. Rather the bad articles would be seen just as bad articles, rather than tainting all the gooey stuff by association.

I fear that this post is further playing into the same dynamic. Its tone reads to me as slightly strawmanny and condescending, further reinforcing the frame of, depending on who you ask, "the gooey people who get it vs. the unenlightened prickly people who need to be taught that non-prickly things can be useful too" and "the gooey people who think of themselves better as prickly people when they're actually just more sloppy thinkers vs. the prickly people who haven't melted their brains and can still reason sensibly".

Hah, this makes a lot of sense. Thanks!

An addition to that: If we look through the goggles of Sara Ness' Relating Languages, the rationalist style of conversation sits at the far end of the internal-focusing dialects Debater/Chronicler/Scientist. In my experience, more gooey communities have far more Banterer/Bard/Spaceholder-heavy interactions, which focus more on people's needs in the situation than on forming and communicating true beliefs. People don't necessarily know which dialects they speak themselves, because their own way of interacting just feels normal to them, and everyone else's weird. It's hard to learn to speak dialects that are not your natural default. For example, I didn't even notice myself slipping into Bard/Banterer while writing this post, but in hindsight it's fairly obvious how it digresses from the LessWrong language game.

I think the LW-way is ideal for its purpose, but I'm realizing that there's a whole lot of tacit knowledge and implicit norms involved in understanding and doing it. This strong selection for a particular style of communication may be responsible for a significant chunk of the difficulty I perceive in interfacing between the rationalist and other memeplexes. This cuts both ways: it hinders both the rationalist community learning from other memeplexes, and useful memes getting from rationalist circles into the outside world.

Thanks for the input!

It wasn't my intention to reinforce this dichotomy. Instead, I hoped to encourage people to name things that break the rationalist community's Overton window, so that others read them and think "Whoopsie, things like that can actually be said here?!" I suspect that way more people here picked up useful heuristics and models in their pre-rationalist days than realize it, because they overupdate on the way of the Sequences being the One True Way. I've learned in other communities that breaking taboos with questions like these is a useful means of breaking conformity pressure. My hope was that eventually, this would help a little to reduce the imbalance towards prickliness I perceive in the rationalist community, and with it, this dichotomy.

Apparently, I haven't yet figured out how to express and enact intentions like these in a way that fits the rationalist language game.

Instead, I hoped to encourage people to name things that break the rationalist community's Overton window, so that others read them and think "Whoopsie, things like that can actually be said here?!"

Is it really the case that such things are outside the Overton Window, though? We've had both well-received posts discussing how to incorporate goo-y stuff before [e.g. 1, 2, 3, 4, 5] as well as various posts expressing things in pretty goo-y terms [e.g. 1, 2, 3, 4, 5]. I don't think LW at least has any taboo against saying these kinds of things; writing in an unusual style might invite some extra scrutiny, but generally the posts will still be received well as long as they're reasonable and well-argued.

It seems to me that there are discussions around the term energy that are blocked by its being outside of the Overton window. I think I had two discussions with Severin in which I needed the concept because it was doing real explanatory work, and in both there was a sense of anxiousness about breaking out of the Overton window.

How to learn soft skills (first on your list) seems like the perfect example. It uses the term energy once, but does so in a pretty Straussian manner.

I do agree that there's often a useful intermediate step for escaping the false dichotomy that's something like "do both A and ¬A." And then, once you have experiential data of each, you can see the ways that the A/¬A dichotomy was fake and not helpful.

But also I worry about people seeing sentiments like the one immediately above, and doing a fallacy-of-the-gray thing, and thinking it means something like "precision doesn't matter."

Precision (and similar stuff) does matter! It's just not the enemy of the-thing-being-called-goo.

Precision (and similar stuff) does matter! It's just not the enemy of the-thing-being-called-goo.

Well… in practice it kind of is.

There's totally a thing where a focus on precision can result in people precluding "goo" and actively attacking attempts to communicate in gooish.

I mean, this is basically what "normies" find annoying about autistic people.

It doesn't have to be this way. I totally agree, precision totally matters. There's a kind of flow between precision and "goo" that seems vastly more functional and fun than either one alone. They can support one another super well.

But to say that precision isn't the enemy of the-thing-being-called-goo seems like it's glossing over a real sociological thing.

Well, to be precise, I said a compound thing which included:

(and it's true that when you look out at the world you see a lot of people that seem pretty tribally on one side or the other)

... so I don't think I fully glossed it over. =P

I like the first paragraph a lot, but I haven't encountered the described simplistic rigidity that much (partly or mostly because I ignore a lot of "the community" parts of discussions), and the analogies don't click for me. Gooey and prickly is kind of evocative, but I can't figure out how to use it. The hardware/OS/software analogy is just weird; I get hardware/firmware/software as a model, but it doesn't map the way you describe.
