Pseudo-rationality is the social performance of rationality, as opposed to actual rationality. Here are some examples:

  • Being overly skeptical to demonstrate how skeptical you are
  • Always fighting for the truth, even when you’re burning more social capital than the argument is worth
  • Optimising for charity in discussions to the point where you are consistently being exploited
  • Refusing to do any social signalling or ever bow to social norms to signal that you're above them
  • Spending too much time reading rationality content or the kinds of things rationalists are interested in
  • Adopting techniques like pomodoros or TAPs merely because all the cool (rationalist) kids are using them, instead of asking if they are really helping you
  • Hating things like post-modernism because other rationalists hate them and not because you've actually thought about it for yourself (but yes, post-modernism is mostly incoherent)
  • Over-analysing unimportant decisions so that you can prove you made the rational decision

Why does this happen? Status and social norms distort the way we see the world. The performance may not fool everyone, but it will fool some people; and even if it fools no-one else, you'll at least fool yourself. Here are some thought patterns:

  • All the other rationalists think I’m a good rationalist, surely I must be (all social incentive systems have loopholes)
  • All the other rationalists do this, so it must be rational (can be applied even if you are doing it to a much higher degree)
  • I am so much more rational than those other people who are wrong/bow to social norms/aren’t at all skeptical (more rational does not equal rational)

Why did I write this post? Well, it seems the next thing you need after becoming a rationalist is something to help you figure out whether you're doing any of it wrong. I hope this helps, but let me know if I should add anything else to the list.

Reflection based on comments:

This gets more complicated when someone desires the successful social performance of rationality as a goal that holds up after reflection. Some people may genuinely value this to a degree that seems excessive to most others, and so may not be acting irrationally. More generally, it seems that every rationalist should value successfully performing rationality to some degree, even if only instrumentally. These considerations complicate judgements about what is or is not pseudo-rational, but they do not invalidate the general concept, since most often these performances are not in line with someone's considered values. Further, the concept has utility in identifying a pattern of behaviour that we might want to discourage as a community.

Footnotes:

This is very similar to Straw Vulcans, except that Straw Vulcans are about how the media represents being logical/rational, while pseudo-rationality is broader and includes misconceptions that may not be prevalent in the media. Another difference is that Straw Vulcans are about defending rationality/logic from being straw-manned, while pseudo-rationality is about encouraging rationalists to consider whether they are really as rational as they think they are.

Also see: Mythic values vs. folk values. Pseudo-rationality is very similar to folk values, except that pseudo-rationality is not about impressing other people, but about fooling yourself.

Comments:

While it can be good to look out for these things in yourself, I am against ever pointing this out in other people's behaviour (not that you suggested doing this). Always stick to the object level, and only ever give feedback of that sort to people you have a high-trust relationship with, and in private.

A nearby post to this that I'd be very happy to read would be you explaining how you'd made a particular mistake of this sort, and how you noticed it.

Yeah, I should have worded that more carefully. It now says: "while pseudo-rationality is about encouraging rationalists to consider whether they are really as rational as they think they are"

"Spending too much time reading rationality content or the kinds of things rationalists are interested in" - this is probably the one that I have the biggest struggle with.

"Always fighting for the truth, even when you’re burning more social capital than the argument is worth" - another one that I've done way too much!

Over-analysing unimportant decisions can make you better at analysing. If you want to learn how to use a hammer well, it can be useful to solve all kinds of problems with the hammer even if the hammer isn't the best tool to solve them.

By your logic you could call Bezos's decision to have a desk made from a door pseudo-frugality. That doesn't change the fact that making decisions like that is how he became one of the richest people on our planet.

Fighting for the truth, even when you're burning more social capital than the argument is worth, is a symbolic act that shows you value truth. This means you shift the cultural norm in the direction of valuing truth. It also shifts your own norms towards being more truthful, and makes you more likely to stay focused on truth in other cases when it actually matters.

There's nothing irrational about valuing the symbolic value of an act at higher than zero.

I suppose there are some game theory considerations here. If people can silence you by costing you more than a certain amount of utility, then they have an incentive to hurt you when you say something that they don't like.
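
To make the game-theoretic point concrete, here is a minimal sketch (all payoff numbers are hypothetical, chosen only for illustration): if your known policy is to back down whenever a fight costs more than the argument is worth, an adversary can silence you cheaply, while a standing commitment to fight removes their incentive to threaten you.

  # Hypothetical payoffs, for illustration only (utility units are arbitrary).
  value_of_argument = 2  # what winning the argument is worth to you
  social_cost = 5        # cost an adversary can impose on you for speaking up
  threat_price = 1       # what imposing that cost costs the adversary

  # Policy A: speak up only when the argument is worth the fight.
  speaks_up = value_of_argument > social_cost  # False: you stay silent
  # An adversary who dislikes your claim can thus silence it for a price of
  # 1: the mere threat of a cost of 5 outweighs your gain of 2, and your
  # known willingness to back down is exactly what makes the threat pay.

  # Policy B: always fight for the truth (the symbolic commitment).
  # The threat no longer changes your behaviour, so paying to impose the
  # cost buys the adversary nothing, and the incentive to threaten fades.
  print("Speaks up under policy A:", speaks_up)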

And I also agree that there is value in building the virtue of truthfulness. And that the symbolic act may inspire more than if you were just being pragmatic.

Hmm... but at the same time, I don't think that social forces can be completely ignored. They are just too powerful. Always fighting for the truth will significantly compromise your ability to achieve your other objectives. I guess what I'm against more than anything is uncritical acceptance of this. Perhaps some people will find that their comparative advantage actually is being a public figure who is always forthright, though I suspect that is a rather small proportion of the population.

(Adding more detail: It's actually much easier for people who aren't as good at thinking for themselves to declare themselves in favour of always saying the truth no matter the cost, because they don't have any truly controversial or damaging views outside of their social group. Or if they do, hypocrisy + cognitive dissonance can solve that problem. True rationalists don't have this out, so the cost is much higher for them).

Hmm... but at the same time, I don't think that social forces can be completely ignored. They are just too powerful.

I don't think doing something for its cultural meaning is ignoring social forces. Saying things shouldn't be done for their cultural meaning looks to me much more like ignoring social forces.

Commitments to strategies and cultural values can be useful.

On the personal level, having clear values helps reduce akrasia. On the organisational level, cultural values lead organisations to use shared heuristics for decision making.

If a new employee sees the Amazon door desk and asks other employees about it, they will get a speech about frugality and see that Amazon is serious about frugality.

Making symbolic decisions like that is best practice for how startups create a company culture that's more than a strategy document nobody reads.

It's actually much easier for people who aren't as good at thinking for themselves to declare themselves in favour of always saying the truth no matter the cost

That wasn't what we were talking about. We weren't talking about declaring oneself in favor of truth, but about actually fighting for it.

There are people who profess to have values but don't follow them. Seeing skepticism as a value and then acting skeptically is a simple expression of living one's values.

We can argue about whether being skeptical or fighting for truth are good values to have, and that might be different for different people, but there's no a priori reason to argue that holding either of those values isn't rational.

I think you miss the point entirely by judging some of these actions as wrong based on your own set of values instead of the goals and values of the person doing them. For example:

  • Being overly skeptical to demonstrate how skeptical you are

The person doing this values the social signalling of skepticism more than efficiency in this particular matter.

  • Always fighting for the truth, even when you’re burning more social capital than the argument is worth

The person doing this values truth more than social capital, or values the argument more than the lost social capital.

  • Optimising for charitability in discussions to the point where you are consistently being exploited

The person doing this values the outcomes of being seen as consistently charitable more than the effect of being exploited on occasion.

  • Refusing to do any social signalling or ever bow to social norms to signal that you're above them

The person doing this is not doing it for signalling purposes, but because the effort to comply with social norms or signalling would take away from the spoons they have to do other things that are more important to them.

It seems to me that the "pseudo-rational" trap one should actually avoid is applying one's own goals/values/utility functions to other people by default.

Thanks, this is an excellent reply. Sometimes you will find that someone's goals truly are different from what you might expect. But more often, I suspect that people aren't fully aware of the trade-off or haven't fully embraced it. I mean, so much of what we do is just because it feels right, not because we've consciously considered it and come to a decision. And maybe I should have spent some time acknowledging this possibility in my post, because I definitely simplify the situation somewhat. I guess the point of this post is to be somewhat provocative: to point out ways someone might be acting that don't match their true values (here I mean values that hold up under reflection).

Back in 2014, I posted some tweets with the #skepticfailuremodes hashtag that are similar to your examples of "pseudo-rationality", some of which were:

  1. Thinking it is virtuous to assign low probabilities, forgetting this implies high probabilities for the negation (see the sketch after this list).
  2. Ignoring an argument after finding a deductive fallacy, even when it can easily be restated as a strong inductive one.
  3. Increasing evidential standards for out-group claims; decreasing them for in-group claims.
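
On the first failure mode, here is a minimal worked sketch (the claims and probabilities are made up for illustration): every "skeptical" low probability is simultaneously a confident bet on the negation, and blanket skepticism across several claims is itself a strong positive claim.

  # Hypothetical claims and probabilities, for illustration only.
  claims = {
      "claim A": 0.05,
      "claim B": 0.10,
      "claim C": 0.02,
  }

  for claim, p in claims.items():
      # P(not X) = 1 - P(X): "only 5% likely" is also "95% sure it's false".
      print(f"{claim}: P = {p:.2f} -> P(negation) = {1 - p:.2f}")

  # Treating the claims as independent, being skeptical of all of them is a
  # confident claim in its own right: the chance that every one is false.
  p_all_false = 1.0
  for p in claims.values():
      p_all_false *= 1 - p
  print(f"P(all three false) = {p_all_false:.2f}")  # about 0.84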

I think pseudo-rationality is still better than no rationality, and it's not obviously possible for someone to move to "real" rationality without pseudo-rationality as an intermediate step.

Rationality is explicitly designed to be the kind of thing you can benefit from even if you're not pure of heart, and that you can't reliably fake to other people without actually doing it. Also, you cannot simply make yourself have the right intentions if you don't already have them.

This part of HPMOR comes to mind:

And it was also clear that Potter was brilliant, and a whole lot more than just slightly mad, and playing a vast game that Potter himself mostly didn't understand, improvised at top speed with the subtlety of a rampaging nundu. But Potter had managed to choose a tactic that Draco couldn't just walk away from. He had offered Draco a part of his own power, gambling that Draco couldn't use it without becoming more like him. His father had called this an advanced technique, and had warned Draco that it often didn't work.

The point is that there are Slytherins in the world who want to play the Rationalist game for Slytherin reasons, and that it's futile to try fixing Slytherins by simply telling them Slytherin values are bad. The whole Rationalist game is designed so that the act of playing the game itself will corrupt even Slytherins towards the Light Side.

So I want to absolutely encourage any (fellow) Slytherins reading to play the Rationalist game as much as you like for whatever reasons you have. I've yet to see anyone play the game at all successfully without a noticeable improvement in the quality of their soul.

I think pseudo-rationality is still better than no rationality, and it's not obviously possible for someone to move to "real" rationality without pseudo-rationality as an intermediate step.

I'm rather doubtful of this claim as presented. I agree that there is a you're-worse-before-you're-better effect involved in taking up any new skill, but why should we expect it to take the form of the pseudo-rationality presented here? To me, much of the behavior of pseudo-rationality looks like what happens when you try to mix rationality into a person who already exhibits certain behaviors. Not to say that pseudo-rationality isn't common, especially among actual rationalists, but someone coming to rationality from a background different from that of the standard rationalist would probably take different sorts of missteps.

But maybe that's what you meant, and I'm just objecting to your phrasing, which makes it sound as if pseudo-rationality is a property someone might take up rather than a set of behaviors we observe them having.

I think I was confused and wanted to point out that this notion of "pseudo-rationality" is actually an entangling of at least two distinct behaviors that I would treat completely separately, a la geeks, MOPs, sociopaths:

  1. "Sociopaths" trying to game the rationalist social ladder for power without actually learning rationality.
  2. "MOPs" trying to get as much out of rationality as possible with minimal effort and critical thinking.

I think I was mostly responding to the first set of things.

This article reminds me of "Uncritical Supercriticality", where people argue in favor of "rationality" a little too hard. It could be an innocent mistake, or it could be done strategically for social reasons. (If it's the latter, you are likely to put "rational" in the name of your website, because that gets even more social points.)

After writing this, I'm not sure I endorse this whole sentiment. To elaborate: it sounds to me like "pseudo-rationality" is just being bad at rationality, and if people really wanted to optimize for social status in the rationality community there is one easiest canonical way to do this: get good at rationality. So there's only a second-order difference between optimizing for pseudo-rationality and optimizing for rationality, and your post sort of just sounds like criticizing people for being bad rationalists in an unproductive tone.

There's a flavor of pseudo-rationality which is about optimizing for social approval from other pseudo-rationalists, e.g. trying to write LW posts by mimicking Eliezer's writing style or similar.

if people really wanted to optimize for social status in the rationality community there is one easiest canonical way to do this: get good at rationality.

I think this is false: even if your final goal is to optimize for social status in the community, real rationality would still force you to locally give it up because of convergent instrumental goals. There is in fact a significant first order difference.

Can you elaborate on this? I have the feeling that I agree now but I'm not certain what I'm agreeing with.

One example is that the top tiers of the community are in fact composed largely of people who directly care about doing good things for the world, and this (surprise!) comes together with being extremely good at telling who's faking it. So in fact you won't be socially respected above a certain level until you optimize hard for altruistic goals.

Another example is that whatever your goals are, in the long run you'll do better if you first become smart, rich, knowledgeable about AI, sign up for cryonics, prevent the world from ending etc.

If you define "rationality" as having good meta-level cognitive processes for carving the future into a narrower set of possibilities in alignment with your goals, then what you've described is simply a set of relatively poor heuristics for one specific set of goals: gaining social status and approval. One can have that particular set of goals and still be a relatively good rationalist. Of course, where do you draw the line between "pseudo" and "actual", given that we are all utilizing cognitive heuristics to some degree? The line seems to me to be drawn somewhat arbitrarily.