The Costs of Rationality

by Robin Hanson · 1 min read · 3rd Mar 2009 · 81 comments
The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources. 

Rationality certainly can have instrumental advantages.  There are plenty of situations where being more rational helps one achieve a wide range of goals.  In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational.  In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., the success that remains after explaining that success via other observable factors.

But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us.  So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.

Yes, rationality might help you win that game or argument, get promoted, or win her heart.  Or more rationality for you might hinder those outcomes.  If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends.  In fact we often know it is not.

The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful.  These are actually pretty likely outcomes in many identifiable situations.  You may think you want to know the truth no matter what, but how sure can you really be of that?  Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for. 

Be warned: the truth just is what it is.  If just knowing the truth is not reward enough, perhaps you'd be better off not knowing.  Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics?  Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?