Thinking of our epistemically troubled friend

by MakoYass · 1 min read · 17th Aug 2021 · 10 comments



I used to have a friend who got into this pattern of... I guess... taking in and propagating a lot of signals that were very rousing and suggestive on the surface, without checking anything in enough detail to tell whether the surface impressions were really true.

They found so many surfaces all agreeing that, say, lizard people are real (a neutral example; I'm not sure whether they believed that one, but they believed comparably strange things). Usually we'd just try to have fun with it, but sometimes we would do the work they weren't doing and show them how a signal had to be false, and they'd just be like, "It doesn't matter if that one was wrong, what about all this other stuff that I saw."

Well, I imagine that's what they were thinking. They'd become too mean and too hubristic to say anything that nice to us.

It was like they didn't believe that trends of falsehood found in randomly sampled subsets of a population of claims could be generalized over the entire population. Like they didn't believe in surveys. Or maybe they thought we weren't taking a fair random sample, that we were just homing in on trivial details to antagonize them.

But we weren't.
It was always the most interesting claim (or the only interesting claim) in these posts that turned out to be bullshit.

It was always the claim that I put time and energy into investigating that led to nothing but disappointment. When I told them about it, it never seemed to move them, or to lead them to question these sources in general; they acted like I was missing some larger point. They kept posting things of this pattern, and always it was the most surprising part that was false, and eventually I just had to stop listening.

Some of my NT goblins still talk to them. I wonder how they are. I ask the goblins why we couldn't save them. We were all in the halls of rationalism; they had read the sequences, at some point; they had read "arguments as soldiers". It hadn't been enough. What was missing? What had we failed to convey?


10 comments

Did you ever spell out the meta and ask them? "It seems to me like you believe false things too easily. Obviously it doesn't seem like that to you. Is there a test we could do that would convince me you're right or convince you I'm right depending on how the test turned out? Like, maybe you could pick 20 beliefs that aren't mainstream and guess how many will hold up if we investigate closely, together, and I could also guess, and if you guess something like 18 and I guess something like 4 and we investigate and jointly decide the right answer was 15 then I admit I have a problem or if we jointly decide the right answer was 5 then you admit you have a problem?"

In practice I could see this working quite well with many of my friends who... don't believe false things too easily. And it would be a "look at you weirdly and walk away" non-starter for the folks I know who I think do believe false things too easily. So ??? but I'm still curious whether you did try direct communication about the meta rather than a grab bag of concretes.

Is there a test we could do that would convince me you're right or convince you I'm right depending on how the test turned out?

We did get to a point where we wanted them to take literally any formal bet to demonstrate that they even believed the things they were saying and that there was a real object-level disagreement, and they wouldn't, anticipating malicious fuckery no matter the terms. That was where I broke off. I suppose, maybe that was one important piece of rationalist culture they did not have: willingness to engage the disagreement in a respectful, grounded way.

I suppose it would be more suitable if instead of wagering money we wagered duties, as wagering money isn't really appropriate between friends, and probably makes them feel insecure.
Something like "If your surface reading of this claim turns out to be wrong after it has been more deeply investigated, then you must investigate three more claims. And if two of those are wrong, you must investigate five more," or something like that. "If you are right, then we must listen to more of your shit (and if you refuse to bet, then some of us definitely aren't listening to more of your shit)."

Between friends I usually wager a sandwich or a cup of coffee. Enough to make it clear that a specific bet is being articulated and agreed upon, but not enough to really hurt anyone's feelings if they lose.

"NT goblins"?

Maybe NT means "neurotypical"? Or NT as in xNTx Myers-Briggs horoscope personality types? "Goblins" really puzzles me, though.

It seems like the individual beliefs weren't cruxes. If you want to convince a person and find that the argument you are making doesn't hit cruxes, it's important to talk to them and understand what their cruxes actually are.

Update: The goblin sack weighs in:

█████ has safety and socio██████ needs that aren't being fully met

needs to feel safe and so on and no rhetoric will change those emotions before or without situations changing

the more times you run a convincing proof against someone who can't emotionally accept it, the more you blunt the convincing proof on them

If I were snarky, I would respond that they find themselves in a state of psychological unsafety because they keep taking hubristic bets and pushing their friends away. Regardless, it is still ultimately our problem to solve.

Another adds:

so [mako's] post is implying that [rationalists] should be able to maslow people lmao

I do totally buy this. "Maslowing" is a term that arose at some point to mean: ensuring that enough of a person's basic selfish needs are met that they can begin to think of other people, the rest of the world, or loftier needs like self-actualization.

I resolve that this task of providing enough psychological safety to allow a person to admit when they were deeply, haplessly wrong (dependent on others for guidance and correction! How horrifying!) is a rationality technique, perhaps the most important rationality technique.

I find that narcissism is our most common adversary, especially in hyperpublic contexts like the global online discourse where narcissism is hard as fuck to resist, and runs in the water.

Narcissism is exactly a felt need to defend a delusional narrative of perfection.

It is a product of social incentives.

We will improve the incentives.

I was going to respond simply that your friend believes these things because they want to believe them. They have to want to be rational.

As for me, I don't put rationality above all things, because I think it can be something you delude yourself into both idolising and thinking you're attaining; you can become something like a paperclip maximiser because you've convinced yourself it's logical. After having been something of a virulent atheist rationalist many years back, I realised that many of the people on my 'side' were in fact narrow-minded and often heartless gits, while some of the religious folk were warm, funny, and very open-minded; faith for them was more a matter of how they wanted the world to be, a matter of aesthetics and drive.

So, basically, if your heart's not in the right place, who cares how rational/right you think you are? That certainly applies to your friend.

Possibly relevant and interesting.  Downvoted due to the feeling that information is intentionally being withheld in order to manipulate my reaction.

The feeling is misplaced. Information is intentionally being withheld to protect the privacy of our friend. The story only represents my perspective, formed after paying only a limited amount of attention to things, and I would discourage anyone close to the situation from taking my characterization of it as conclusive.