I usually define "rationality" as accuracy-seeking whenever decisional considerations do not enter. These days I sometimes also use the phrase "epistemic rationality".

It would indeed be more complicated if we began conducting the meta-argument that (a) an ideal Bayesian not faced with various vengeful gods inspecting its algorithm should not decide to rewrite its memories to something calibrated away from what it originally believed to be accurate, or that (b) human beings ought to seek accuracy in a life well-lived according to goals ...

You started off using the word "rationality" on this blog/forum, and though I had misgivings, I tried to continue with your language. But most of the discussion of this post seems to be distracted by my having tried to clarify that in the introductory sentence. I predict we won't be able to get past this, and so from now on I will revert to my usual policy of avoiding overloaded words like "rationality."

timtyler: If truth is a bad idea, it's not clear what the reader is doing on Less Wrong [...] Believing the truth is usually a good idea - for real organisms. However, I don't think rationality should be defined in terms of truth seeking. For one thing, that is not particularly conventional usage. For another, it seems like a rather arbitrary goal. What if a Buddhist claims that rational behaviour typically involves meditating until you reach nirvana? On what grounds would that claim be dismissed? It seems to me an equally biologically realistic goal. I think convention has it right here: the details of the goal are irrelevant to rationality, and should be factored right out of the equation. You can rationally pursue any goal - without exception.

The Costs of Rationality

by RobinHanson · 1 min read · 3rd Mar 2009 · 81 comments



The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources. 

Rationality certainly can have instrumental advantages.  There are plenty of situations where being more rational helps one achieve a wide range of goals.  In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational.  In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., after explaining that success via other observable factors.
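
One way to read that "residual" estimate is as a regression: explain success using whatever observable factors you have, and treat the leftover residual as a crude proxy for belief-mediated skill. Here is a minimal sketch under strong, explicitly artificial assumptions (simulated data, linear effects, and skill independent of the observed factors); none of the variable names come from the post.

```python
# A minimal sketch of "residual" belief-mediated success.
# All data are simulated; this is an illustration, not the post's method.

import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical observable factors (e.g., wealth, connections, industry).
factors = rng.normal(size=(n, 3))
skill = rng.normal(size=n)  # unobserved "rationality" we hope to recover

# Success = observable factors + unobserved skill + noise.
success = factors @ np.array([1.5, 0.8, 0.3]) + 2.0 * skill \
          + rng.normal(scale=0.5, size=n)

# Ordinary least squares on the observables alone.
X = np.column_stack([np.ones(n), factors])
beta, *_ = np.linalg.lstsq(X, success, rcond=None)
residual = success - X @ beta

# Because skill is independent of the observed factors, the residual
# tracks it closely; with confounding, this estimate would degrade.
print(np.corrcoef(residual, skill)[0, 1])  # high, roughly 0.97
```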

But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us.  So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.

Yes, rationality might help you win that game or argument, get promoted, or win her heart.  Or more rationality for you might hinder those outcomes.  If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends.  In fact we often know it is not.

The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful.  These are actually pretty likely outcomes in many identifiable situations.  You may think you want to know the truth no matter what, but how sure can you really be of that?  Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for. 

Be warned: the truth just is what it is.  If just knowing the truth is not reward enough, perhaps you'd be better off not knowing.  Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics?  Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?
