by [anonymous]
1 min read · 21st Oct 2012 · 11 comments

A short argument from an interesting blog.

Anti-Groupism

Basic Aretaevian talking points:

  1. Human brains are effectively populated by rabbits.  Your conscious mind is like a very small person attempting to ride a large herd of rabbits, which aren't all going the same direction.  Your job is to pretend to be in control, and make shit up to explain where the rabbits went, and what you did.
  2. Human bunny brains are optimized for social activity, not intellectual activity.  If your brain thinks principles first, instead of groups first, it's broken, and not just a little bit.
  3. Of course, this means that anyone thinking group first is almost completely full of crap regarding their reasoning process.  They're (99.86% certainty) making shit up that makes the group look good, and the actual rational value of the statement is near zero.  The nominal process "A->B->C" is actually C, now let's backfill with B and A.  
  4. Therefore I'm almost only interested in listening to folks who are group-free.  If your brain is broken in the kind of way that prohibits group-attachment...then you're far far more likely to be thinking independently, and shifting perspectives.
  5. Aside:  FWIW, this is the core (unsolvable?) problem that inhabits rationalist groups.  There is a deep and abiding conflict between groupism and thinking.  The Randians have encountered this most loudly, but it's also there in the libertarians, the extropians, the David Deutsch-led Popperian rationalists, and the LessWrongers.
New discovery, shouldn't have been as surprising as it was.   When looking for folks who are group-avoidant, I seem to have phenomenally good luck finding great people when talking with Gays from non-leftist areas (rural Texas, Tennessee, downstate Illinois).  Because they don't/can't fit in with their local culture, and often can't conveniently exit, they become interesting people.   It's a surprisingly good metric.

11 comments

Humans, being awesome, can walk and chew gum at the same time. Metaphorically, even. And it's even easier when sometimes you want to walk, and sometimes you want to chew gum. Salient example: Alicorn mentioned "Illuminati meetings" as fun social events in her recent article.

[-] [anonymous] · 11y · 40

As a note, there is a follow-up post from the next day, which I've posted below:

http://aretae.blogspot.no/2012/10/responding-on-rabbits.html

Responding on Rabbits

In the comments of the last post, Dr. Pat expands my thinking about bunnies notably. On the other hand, I'm not willing to step away from my explanation just yet. Here's my "just so story".

Most people have a group of 5 rather muscular bunnies that focus on group dynamics, group belonging, etc. Their preferences are aligned enough that they usually pull in the same direction. In practice, this means that in conflicts, this particular group of bunnies gets their way most of the time. There is also another bunny, usually weak and sickly (or a frog), who checks for ideational consistency. That frog usually moves backwards.

In some rare folks, the frog is unusually muscular. Not a normal frog or even a bullfrog, but a big-ass pixie frog who eats rats. He gets what he wants a little bit. Or he has a buddy: 2 giant pixie frogs. These people would land in what Simon Baron-Cohen (autism researcher) talks about as high on the systematizing scale. Now, some other rare folks would have group bunnies that were sick...they had polio as baby bunnies. One of the 5 died. The other 4 are crippled and can't walk effectively.

If you run into a person who (a) has crippled group bunnies, and (b) has giant pixie-frogs...then you get a different approach to cognition than you see in most.

That doesn't say it's better.

FWIW, the book that most informed my thinking on Rabbit-Brains is "Why Everyone (Else) Is a Hypocrite" by Robert Kurzban. Fabulous book. Rabbits are my wording.

[-] TimS · 11y · 30

Therefore I'm almost only interested in listening to folks who are group-free.

Is there any reason to think that the population under discussion is "group-free" as opposed to being a member of a non-mainstream group? In other words, might the "group-free" be just as mind-killed about their personal identity as a more mainstream mind-killed group (like gun-owners or environmentalists)?

I'm also loving that blog; it was one of about three in the big "list of LessWronger blogs" discussion post whose archives seemed worth working through.

Points 1 and 2 are pretty hard to dispute; my thoughts first turn to the split-corpus-callosum experiments regarding #1 and the difference between abstract and non-abstract Wason selection task results regarding #2.

Point 3 may be slightly wrong. For instance: If you want to show a cohesive group front, you need Schelling points to rally around, and "this is a rational argument" makes a good such point, which works to the extent that people try to develop ways to distinguish genuine rationality from rationalization. We have incentives to learn to trick people via rationalization, but also incentives to learn how to avoid being tricked.

The first sentence of point 4 is likely wrong, ironically because of point 2, due to the base rate fallacy. If the vast majority of people don't have broken intellectualism-before-group-identity brains, then the majority of good ideas may still come from them, despite their lower per-capita production of those ideas.
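
A quick back-of-the-envelope illustration of that base-rate point, with entirely made-up numbers (the 2% prevalence and the 10x per-capita rate below are illustrative assumptions, not estimates):

```python
# Illustrative base-rate arithmetic with made-up numbers (not real data).
group_free_fraction = 0.02   # assume 2% of people are "group-free"
group_free_rate = 10.0       # assume they produce good ideas 10x as often per capita
majority_rate = 1.0          # baseline per-capita rate for everyone else

ideas_from_group_free = group_free_fraction * group_free_rate      # 0.20
ideas_from_majority = (1 - group_free_fraction) * majority_rate    # 0.98

share = ideas_from_group_free / (ideas_from_group_free + ideas_from_majority)
print(f"Share of good ideas from the group-free: {share:.0%}")  # ~17%
```

Even with a large per-capita advantage, the "group-free" minority accounts for well under half of the good ideas, which is exactly the base-rate point above.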

The second sentence corresponds to my experience, but it may often be wrong because point 3 hasn't been taken far enough: how many iconoclasts are really "group-free", and how many are just following a different subconscious social pattern like "if you can't climb the group hierarchy from within, conspicuously oppose it and found a new hierarchy"?

Posting here is the closest I've ever come to joining a "rationalist group", so I can't usefully comment on point 5.

Anti-groupISM? AretaeVIAN?

This is similar to some of my recent thoughts on epistemic versus social rationality. Haidt distinguishes different moral modalities; I'd extend that to different truth modalities. There's a tradeoff between epistemic truth and social truth, which those with a bias for epistemic truth are best placed to analyze, even though that bias otherwise handicaps them in social situations.

I'm not confident I know what you mean by "social truth". Can you break that apart?

Here's a longer and more contextualized comment on the same topic: http://lesswrong.com/lw/eqn/the_useful_idea_of_truth/7jyn

I'd break apart instrumental rationality instead, because grouping what is in there is less important than identifying what is in there. I made a start at that above. Epistemic truths allow for accurate modeling of the world. That is very instrumentally rational (note that epistemic rationality is a subtype of instrumental rationality), but there are ways in which a belief can be useful besides modeling.

Here are some ways a belief can be socially useful:

Back to what I can do with a belief: I can tell it to my neighbor. That becomes a very complicated use, because it now involves interaction with another mind with its own knowledge. I can inform my neighbor of something. I can lie to my neighbor. I can signal to my neighbor. There are quite a number of uses for communicating a belief to my neighbor. One interesting thing is that I can communicate things to my neighbor that I don't even understand.

What I would expect, in a population of evolved beings, is that there'd be some impulse to judge beliefs for all these uses, and to varying degrees for each usage across the population.

The signaling aspect of beliefs is likely the most socially powerful aspect.

I get all this, I think. I didn't realize you were equating "socially useful" and "socially true."

I guess those might feel very similar; that one's experience of the social use of a belief could feel a lot like truth. In fact, a belief seeming socially useful, a belief seeming not to cause cognitive dissonance, and a belief seeming epistemically true might be the same experience in other people's heads - say, a belief feeling "right."

Despite knowing this, I still feel deeply wronged and get filled with negative emotion whenever I see or hear the phrases "social truth" or "socially true". A bit like watching someone get raped or pushed onto the tracks of an incoming train or something.

Thanks, your comment was useful. This helped me reorder and re-estimate my values a bit.

[-] TimS · 11y · 00

I agree with you - social truth or intersubjective truth are different things from empirical truth, and pretending otherwise is misleading.