Why *I* fail to act rationally

There is a lot of talk here about sophisticated rationality failures - priming, overconfidence, and so on. There is much less talk about what I think is the more common reason people fail to act rationally in the real world - something most people outside this community would agree is the most common rationality failure mode - acting emotionally. (pjeby has just begun to discuss this, but I don't think it's the main thrust of his post.)

While there can be sound evolutionary reasons for having emotions (the thirst for revenge as a Doomsday Machine being the easiest to understand), and while we certainly don't want to succumb to the fallacy that rationalists are emotionless Spock-clones, I think overcoming (or at least being able to control) emotions would, for most people, be a more important first step to acting rationally than overcoming biases.

If I could avoid saying things I'll regret later when angry, avoid putting down colleagues through jealousy, avoid procrastinating because of laziness and avoid refusing to make correct decisions because of fear, I think this would do a lot more to make me into a winner than if I could figure out how to correctly calibrate my beliefs about trivia questions, or even get rid of my unwanted Implicit Associations.

So the question - do we have good techniques for preventing our emotions from making bad decisions for us? Something as simple as "count to ten before you say anything when angry" is useful if it works. Something as sophisticated as "become a Zen Master" is probably unattainable, but might at least point us in the right direction - and then there's everything in between.

18 comments

I'm afraid I'm repeating myself when I say this, but there is already a school of rationalism that discusses exactly this question, and it's called cognitive behavioural therapy, or CBT. As far as I can tell, CBT is exactly the process of using our capacity for rational introspection to improve our mental health, and it is the most empirically effective talking therapy there is.

It is worth repeating. CBT includes many powerful techniques for understanding why we have the automatic thoughts & reactions we do and how to change the ones that upon reflection interfere with achieving our goals. It addresses the original poster's concerns exactly.

OK, that's great - it sounds like 'something in the middle'. So what should I do? I don't have any diagnosable psychological illness, or even any problems a psychiatrist would be interested in, but I do sometimes have emotional reactions that I'd like to control.

Is CBT aimed specifically at helping people with psychological conditions? Or does it have useful elements that perfectly healthy people can use to help them get over perfectly normal problems? And how can I find out about them?


It's the only empirically-effective talking therapy there is.

Depending on how the standards are set, it's also the only effective psychiatric intervention, period. Manipulating symptoms is nice but not nearly enough.

The jury's still out, but EMDR seems promising - it's questionable whether the eye movements are necessary, but it seems to perform as well as CBT.

Actually, in one study, TFT beat out EMDR, but then one of the researchers came up with a hypothesis to explain the effectiveness of TFT, EMDR, TIR, and the NLP V/KD technique... and designed something even better:

After the research study was over, there was much persuasive argument from each of the proponents of the brief therapy methods represented. In a later NLP workshop, Ed Reese challenged me to test the hypothesis of pattern destabilization. I proposed that any stimuli capable of affecting a perturbation in visual, auditory, and kinesthetic modes simultaneously would prove to be as effective in eliminating a traumatic experience as TFT, even without the use of their complex algorithms. The stimuli that I proposed to test the hypothesis with was a game readily found in all children's toy stores called Simon.

The research shows that not all of the efficacy of the drugs is down to the placebo effect.

Certainly drugs have effects. Whether the effects of the drugs are really a help is questionable.

There are a few conditions that people usually just can't cope with without drugs, even though the drugs have serious downsides. Lithium is a godsend for manic depression, despite being quite dangerous - but considering how destructive repeated cycling is to people's lives, it's worth the risk.

This touches on what is, for me, one of the big open questions about what it means to act rationally. I question the common position that the kinds of 'irrational' decisions you describe are actually all that irrational. Many such decisions seem to be rational decisions for an agent with a high time preference at the moment of decision. They may seem irrational from the perspective of a future self who looks back on them when dealing with the consequences, but I see the problem as more one of conflicting interests between present selves and past/future selves than one strictly of rationality. As the recent post discussed, rationality doesn't provide goals; it only offers a system for achieving goals. Many apparently irrational decisions are, I suspect, rational responses to short-term goals that conflict with longer-term goals.

If I decide to eat a chocolate bar now to satisfy a current craving, I am not really acting irrationally. I have a powerful short-term drive to eat chocolate, and there is nothing irrational in my actions to satisfy that short-term goal. Later on I may look at the scales and regret eating the chocolate, but that reflects either a conflict between short-term and long-term goals or a conflict between the goals of my present self and my past self (really just alternative ways of looking at the same problem). It is not a failure of rationality in short-term decision making; it is a problem of incentives not aligning across time frames, or between present and future selves. In order to find solutions to such dilemmas, it seems more useful to look to microeconomics and the design of incentive structures that align incentives across time scales than to ways to improve the rationality of decisions. The steps I take to acquire chocolate are perfectly rational; the problem is with the conflicts in my incentive structure.
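The time-preference point can be made concrete with a toy calculation (a minimal sketch, not from the original comment - the utility numbers and discount rates here are invented purely for illustration):

```python
# Illustrative sketch: the same choice can be utility-maximizing or not
# depending only on how steeply the agent discounts future consequences.

def discounted_utility(rewards, discount):
    """Sum of rewards, where a reward t steps in the future is scaled by discount**t."""
    return sum(r * discount**t for t, r in enumerate(rewards))

# Option A: eat the chocolate now (+10 pleasure at t=0, -15 regret at t=1).
eat = [10, -15]
# Option B: skip it (nothing now, nothing later).
skip = [0, 0]

# A patient self (discount near 1) weighs the regret almost fully and skips...
patient = 0.9
assert discounted_utility(eat, patient) < discounted_utility(skip, patient)

# ...but a present-focused self (steep discounting) rationally eats.
impatient = 0.5
assert discounted_utility(eat, impatient) > discounted_utility(skip, impatient)
```

Both selves are maximizing the same kind of discounted sum; they simply disagree about the discount rate, which is the commenter's point that the conflict is between selves, not a failure of rational calculation.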

Many such decisions seem to be rational decisions for an agent with a high time preference at the moment of decision.

Emotions can also have a lower time preference than your conscious self. For example, a surge of anger can make you stand up against a bully and win much more than the present confrontation in long term self-respect and respect of others, even if you eventually "lose" this particular conflict. My subconscious is always tracking the intangible "social" terms of my long range utility function, and over the years I've come to appreciate that.

I'd describe that as a situation where your long-term interests and your very-short-term interests gang up on your short- to medium-term interests.

A great description - funny how it applies to other emotional acts such as cheating on your spouse (increasing reproductive chances while risking the comfort of family life). It might be enlightening to think of some emotions as optimizations for the very long term - for you and all your descendants (which makes sense, as emotions were created by evolution) - and the rational mind as optimizing for the short to medium term.

The irrational aspect is not that there is a conflict between the short term and the long term, or that you act on short-term consequences. It is that while contemplating the chocolate bar you can rationally weigh the short-term, medium-term, and long-term consequences, conclude from that careful weighting that not eating it is preferable - and still end up eating it anyway.

There's a difference between correcting your behaviour to adapt to what you'd judge, intellectually, to be rational, and correcting what generates your behaviour.

It's the same difference that exists between running a compiled program and an interpreted one (or at least it was, back when 'interpreted' meant the program would run really slowly).

The second may not be possible in quite a few cases. The first is probably possible, but unless it becomes a learned reflex, you'll always have to expend some mental energy on it ( http://en.wikipedia.org/wiki/Ego_depletion ). In the end, that may work against your objective of acting more in accordance with your best rational judgement.

The best idea I can come up with is: be honest with yourself. There's always a reason why you'd act against your judgement. What is that reason, and why is it stronger?

Where does the strength that your rational decisions may possess come from? From your desire to be rational, from the belief that it is going to be more efficient, from the expected better payoff you'll attain by acting rationally as opposed to having no strategy and following your impulses more or less blindly?

If that strength isn't enough, but you still "want" to act rationally, then maybe you could search inside yourself for the correct feeling - the one you know you'd feel if you were to have the rational reaction. You are angry at someone? Why? How would you feel if you weren't? Can you step back, examine yourself as if you were a third party, and decide that, after all, the satisfaction of letting that anger go wouldn't be a great loss if you decided to experience a different feeling? Or would you rather have that satisfaction after all? The same goes for most other similar issues. There's a reason why you act in some way and not another. Find it, see if it's really worth it, see how else you could feel, pause for a moment, and see if you could - and would want to - feel like that after all. Then decide for yourself.

I really like Matt's point that not all undesired behaviors are irrational. Rather they reflect conflicts of interest within yourself, at a single time or over different points in time. It makes sense that we would have conflicts since we are very complex systems trying to optimize several things simultaneously.

In a stereotype of rationality, rational people are seen as being without emotions or physical senses, like computers or robots. Unlike computers and robots, though, people are human beings with organic bodies. I think it is a mistake to discount the importance of having physical bodies, which place demands on our utility functions. Matt gave the example of wanting some carbs. My thesis in this comment is that perhaps all irrational behaviors that are not due to faults in logic or incompletely considered information are the "fault of" our physical bodies. Everyone knows that if we don't feel well, it changes everything. Many people can't think rationally if they're too hungry.

Setting aside the broad category of undesired behaviors that are really examples of the conflicts of interest Matt described, I asked myself: at what other times does emotion cause me to act irrationally? These would have to be examples where I behave in a way that I really don't prefer (i.e., not just due to a conflict of interest) but am unable to make decisions the way I do prefer because of my emotions.

I can think of many, many examples! In these examples, my emotions hold sway and cause me to act in ways that I do not wish -- not even at that time. Then in these cases, is it not another example of the influence of a physical body? Perhaps you have a different view, but I think of emotions that I cannot control as being physically based. If I could just turn off the surge of hormones in my body, then I could behave normally and rationally.

I would be interested in ways (mind over matter? psychology? cognitive behavioural therapy, as ciphergoth mentioned?) to have more control over these hormones when some control is needed.

Visualizing the consequences of whatever it is I'm trying not to do usually works for me.