I've always made a distinction between rationality and truth-seeking. Rationality is only intelligible in the context of a goal (whether that goal is rational or irrational). One who acts rationally, given their information set, will choose the best plan of action for achieving their goal. Part of being rational is knowing which goals will maximize one's utility function.

My definition of truth-seeking is basically Robin's definition of "rational." I find it hard to imagine a case where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?

I like to distinguish information-theoretic rationality from decision-theoretic rationality. (But these are rather long terms.) Often on this blog it's unclear which is meant (although you and Robin did make it clear).
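To make that distinction concrete, here is a minimal formal sketch (my own notation, not anything from Robin's post or the comments). Decision-theoretic rationality chooses actions: given an information set $I$, actions $a \in A$, states $s$, and a utility function $U$,

\[ a^* = \arg\max_{a \in A} \sum_{s} P(s \mid I)\, U(a, s). \]

Information-theoretic rationality chooses beliefs: it picks the credence distribution $Q$ that scores best on accuracy alone, e.g. by minimizing expected log loss against the true distribution $P$,

\[ Q^* = \arg\min_{Q} \; \mathbb{E}_{s \sim P}\left[ -\log Q(s) \right]. \]

The examples below (placebo effects, displayed beliefs) are exactly cases where the beliefs that maximize the first objective differ from the beliefs that minimize the second.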

mark_spottswood: Pwno said: "I find it hard to imagine a case where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?"

The classic example would invoke the placebo effect. Believing that medical care is likely to succeed can actually make it more successful; believing that it is likely to fail might vitiate the placebo effect. So if you are taking a treatment with the goal of getting better, and that treatment is not very good (but is the best available option), then it is better from a rationalist, goal-seeking perspective to have an incorrectly high assessment of the treatment's probability of success.

This generalizes to other areas of life where confidence is key. When dating, or going to a job interview, confidence can sometimes make the difference between success and failure. So it can pay, in such scenarios, to be wrong (so long as you are wrong in the right way).

It turns out that we are, in fact, generally optimized to make precisely this mistake. Far more people think they are above average in most domains than hold the opposite view. Likewise, people regularly place a high degree of trust in treatments with a very low probability of success, and we have many social mechanisms that try to encourage such behavior. It might be "irrational," under your usage, to try to help these people form more accurate beliefs.
timtyler: Well, sure. Repeating other posts, but one of the most common examples is when an agent's beliefs are displayed to other agents. Imagine that all your associates think that there is a Christian god. This group includes all your prospective friends and mates. Do you tell them you are an agnostic/atheist, and that their views are not supported by the evidence? No, of course not! However, you had better not lie to them either, since most humans lie so poorly. The best thing to do is probably to believe their nonsense yourself.

The Costs of Rationality

by RobinHanson · 1 min read · 3rd Mar 2009 · 81 comments



The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources. 

Rationality certainly can have instrumental advantages.  There are plenty of situations where being more rational helps one achieve a wide range of goals.  In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational.  In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., the success that remains after accounting for other observable factors.

But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us.  So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.

Yes, rationality might help you win that game or argument, get promoted, or win her heart.  Or more rationality for you might hinder those outcomes.  If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends.  In fact we often know it is not.

The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful.  These are actually pretty likely outcomes in many identifiable situations.  You may think you want to know the truth no matter what, but how sure can you really be of that?  Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for.

Be warned; the truth just is what it is.  If just knowing the truth is not reward enough, perhaps you'd be better off not knowing.  Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics?  Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?
