How do you notice when you're rationalizing? Like, what *actually* tips you off, in real life?
I've listed my cues below; please add your own (one idea per comment), and upvote the comments that you either: (a) use; or (b) will now try using.
I'll be using this list in a trial rationality seminar on Wednesday; it also sounds useful in general.
Cue: Any time my brain goes into "explaining" mode rather than "thinking" ("discovering") mode. These are rather distinct modes of mental activity and can be distinguished easily. "Explaining" is much more verbal and usually involves imagining a hypothetical audience, e.g. Anna Salamon or Less Wrong. When explaining I usually presume that my conclusion is correct and focus on optimizing the credibility and presentation of my arguments. "Actually thinking" is much more kinesthetic and "stressful" (in a not-particularly-negative sense of the word) and I feel a lot less certain about where I'm going. When in "explaining" mode (or, inversely, "skeptical" mode) my conceptual metaphors are also more visual: "I see where you're going with that, but..." or "I don't see how that is related to your earlier point about...". Explaining produces rationalizations by default but this is usually okay as the "rationalizations" are cached results from previous periods of "actually thinking"; of course, oftentimes it's introspectively unclear how much actual thought was put into reaching any given conclusion, and it's easy to assume that any conclusion previously reached by my brain must be correct.
Cue for noticing rationalization: I find my mouth responding with a "no" before stopping to think or draw breath.
(Example: Bob: "We shouldn't do question three this way; you only think so because you're a bad writer".
My mouth/brain: "No, we should definitely do question three this way! [because I totally don't want to think I'm a bad writer]"
Me: Wait, my mouth just moved without me being at all curious as to how question three will play out, nor about what Bob is seeing in question three. I should call an interrupt here.)
Cue: I find that I have multiple, independent reasons to support the same course of action. As with Policy Debates Should Not Appear One-Sided, there is every reason to expect multiple lines of evidence to converge on the truth, but there is no reason to expect multiple independent considerations to converge on the same course of action.
You can find multiple, independent considerations to support almost any course of action. The warning sign is when you don't find points against a course of action. There are almost always multiple points both for and against any course of action you may be considering.
When I can't explain my reasoning to another person without a feeling of guilt that I am slightly manipulating them.
I am better at noticing when I lie to other people than when I lie to myself. Maybe it's because I have to fully verbalize my arguments, so it is easier to notice the weak parts that would otherwise be skipped. Maybe it's because when I talk to someone, I model how they would react if they had all the information I have, and this gives me an outside view.
When I feel relief that I did not have to change my point of view after thinking through something. ("[Person] representing [group] said [X], therefore I should continue supporting [Y].")
Orthogonal: Noticing when you're rationalizing is good, but assuming that you might be rationalizing and devising a plan of information acquisition and analysis that is relatively resilient to rationalization is also a safe bet.
I noticed that there is a certain perfectly rational process that can feel a lot like rationalization from the inside:
Suppose I were to present you with plans for a perpetual motion machine. You would then engage in a process that looks a lot like rationalization to explain why my plan can't work as advertised.
This is of course perfectly rational, since the probability that my proposal would actually work is tiny. However, this example does leave me wondering how to separate rationalization from rational reasoning driven by possibly excessively strong priors.
The twinge of fear. When I came across Bruine de Bruin et al 2007 as a cite for the claim that sunk costs lead to bad real-world consequences (not the usual lab hypothetical questionnaires), I felt a twinge of sickness in my stomach - and realized that I had now bought thoroughly into my sunk cost essay and that I would have to do an extra-careful job reading that paper, since I could no longer trust my default response.
Cue: The conclusion I'm investigating is unlikely to be correct, as in I feel that I couldn't have enough understanding to solve the problem yet, to single out this particular conclusion, and so any overly specific conclusion is suspect and not worth focusing on. It's the "sticky conclusion" effect: once considered, an idea wants to be defended.
Cue: I feel emotionally invested in a particular fact being true. I feel like a blue or a green. May be related to Anna's point about ugh fields.
Let's also try the converse problem: what cues tip you off, in real life, that you are actually thinking, and that there is actually something you're trying to figure out? Please stick such cues under this comment.
Cue for noticing rationalization: I notice that I want a particular conclusion, and that I don't actually anticipate updates from the considerations that I'm "thinking through" (if alone) or saying/hearing (if conversing).
Cue: I say the word "clearly", "hopefully", or "obviously".
This definitely doesn't always indicate rationalization, but I say one of those words (and probably some others that I haven't explicitly noticed) with much greater probability if I'm rationalizing than if I'm not, and saying them fairly reliably kicks off an "ORLY" process.
Cue: Noticing that I'm only trying to think of arguments for (or against, if I'm arguing with someone) some view.
Self-supplication is a strong indicator of rationalization in me. Phrases like, "at least", "can't I just", "when can I..." are so far always indicators that I'm trying to get myself to do something I know I shouldn't or stop doing something I know I should be doing.
Cue: I have an "ugh field" across part of my mental landscape -- it feels almost like literal tunnel vision, and is the exact same feeling as the "ugh field" I might get around an unpaid bill. (Except this time it's abstract; it descends out of nowhere while I'm having a conversation about whether we should hire so-and-so, or what SingInst strategy should be, or whatever).
Cue for noticing rationalization: My head feels tired and full of static after the conversation.
When I get into a particular negative emotional state, I make up reasons that I feel that way. When I start a sentence with a bitter "Fine!" or "I should have known better," it's a guarantee that the next statement out of my mouth ("you obviously don't care", "there's no point in cooking for you because you hate food," etc.) will be something I know to be false but that, if it were true, would explain why I feel rotten. Physically, the cue is me turning away from the person I'm speaking to. The actual explanation...
Cue: I feel low status. In low status mode, I feel I'm on the defensive and I have something to prove. In high status mode, I think there are two effects that help avoid rationalization...
Edit: ah, Vladimir_Nesov said something similar on the other thread
I notice I'm rationalizing when after I lose my first defensive argument, I have another lined up and ready to go. If that continues in succession (2 or 3), then I have to stop and rethink my position.
I have a thought in my head that tells me "I'm being irrational", and I try to shut it off, or not care about it. Usually this leads to an increased level of frustration and anger.
When in my mind I've already realized I'll have to change my opinion, but I don't want to admit it right now, during the argument.
I think I perceive a pattern in my speech, possibly a change of tone in my voice, a certain repetitive pattern of thoughts, and a shift in goals (toward convincing others or myself).
Cue: my internal monologue contains the sequence "... which is obviously just a rationalization, but..." and then proceeds as if it were true.
I never notice. Either I don't do it, or I just don't see it.
But I see other people rationalizing a lot.
People persistently disagreeing with me is one sign.
The more numerous they are, the smarter they are, and the more coherent-sounding their arguments, the more likely it is that there's a problem at my end.
Of course, I often ignore such signals. For example, I still think that global warming is good, that our distant ancestors were probably clay minerals, that memetics is cool, that there will be a memetic takeover - and lots of other strange things.
Cue for rationalising: I feel like I'm 'rocking back' or 'rolling back down the bowl'. Hmm. Let me clarify.
A ball in a bowl will roll around when the bowl is shaken, but it goes up the side, reaches a zenith, and rolls back down. Similar for trying to scale a steep hill at a run; you go up, reach a zenith, come back down. And again for balance: you wobble in a direction, find a point of gaining balance, and return to center.
The cue is feeling that in a conversation, discussion, or argument. We sort of roll around discussing things, something comes up that ...
Cue: a feeling of urgency, of having to come up with a counterargument to every possible criticism of my reasoning before someone raises it (I sometimes think by imagining a discussion with someone, so I don't necessarily have to anticipate actually talking about it for this feeling to come up).
Cue: Non-contingency of my arguments (such that the same argument could be applied to argue for conclusions which I disagree with).
The feeling that I'm getting into some sort of intellectual debt. Kind of like a little voice saying "you may say this fits with your understanding of the situation, but you know it doesn't really, and if it doesn't come back to bite you, it's because you're lucky, not because you're right. Remember that, bucko."
Quite why it uses the word "bucko" is beyond me. I never do.
Cue: my internal monologue (or actual voice) is saying one thing, and another part of my mind that doesn't speak in words is saying "bullshit".
Cue: An argument that is being advanced for or against a conclusion doesn't distinguish it from its alternatives (mostly notice in others).
Cue for noticing rationalization: In a live conversation, I notice that the time it takes to give the justification for a conclusion when prompted far exceeds the time it took to generate the conclusion to begin with.
Cue for noticing rationalization: I feel bored, and am reciting thoughts I've already thought of.
Whenever I start to get angry and defensive, that's a sign that I'm probably rationalizing.
If I notice, I try to remind myself that humans have a hard time changing their minds when angry. Then I try to take myself out of the situation and calm down. Only then do I try to start gathering evidence to see if I was right or wrong.
My source on 'anger makes changing your mind harder' was 'How to Win Friends and Influence People'. I have not been able to find a psychology experiment to back me up on that, but it has seemed to work out for me in real life. It ...
For rationalizing an action - I realize that the action is something I really want to do anyway. And then I figure (right or wrong) that all the reasons I just thought up to do it are made up, and I try to think about why I want to do it instead.
For me, it's caring about means vs. ends.
Rationalizing: I'm thinking about how to justify doing something I'm already comfortable with; adhering to a plan or position I've already accepted. So I invent stories about how the results will be acceptable, that I couldn't change them anyway, that if they are bad it will have been someone else's fault and not mine, etc.
Not rationalizing: I'm thinking about the ends I'm trying to accomplish. I'm not committed to an existing plan or position. I'm curious about what plan will achieve the desired results. If the results turn out bad, I'll critique my own plan and change it (I suppose we call that lightness) rather than blaming someone else.
Like some other people have said, one of my biggest tip-offs is if I have a strong negative reaction to something. Often this happens if I'm reading a not-particularly-objective report, experiment, treatise or something which could have been written with strong biases. My mind tends to recoil at the obvious biases and wants to reject the entire thing, but then the rational part of me kicks in and forces me to read through the whole thing before parsing an emotional reaction. After all, a point of being rational is to be able to sieve through other writers'...
I feel really right about whatever it is, with many clever and eloquent arguments, but I also sorta know that I can't quite epistemologically justify it.
(Note that I may actually be right - that doesn't mean I wasn't just rationalising.)
Cue: Any time I decide that it wouldn't be worth my time to closely examine counterarguments or evidence against a live hypothesis. This normally takes the form of a superficially plausible cost/benefit analysis with a conclusion along the lines of "if you examined every single crackpot theory then you wouldn't have any time to look at things that mattered", which can largely only be justified by a rationalized working model of how mental accounting works.
The problem is not with 'rationalization'. Many mathematical proofs started with an [unfounded] belief that some conjecture is true, yet are perfectly valid, as the belief has been 'rationalized' using solid logic.
The problem is faulty logic; if your logic is even a small bit off on every inference step, then you can steer the chain of reasoning towards any outcome. When you are using faulty logic and rationalizing, you are steering toward some outcome that you want. When you are using faulty logic and actually trying to figure out what is true, then you just acc...