Yes, I suppose I should. By a non-rational goal I meant a goal that was not necessarily to my benefit, or to the benefit of the world: a goal with negative net value. Things like poisoning a reservoir or marrying someone who will make your life miserable.

You decided to try achieving that "non-rational" goal, so it must be to your benefit (at least, you must believe so).

An example that I usually give at this point is as follows. Is it physically possible that in the next 30 seconds I'll open the window and jump out? Can I do it? Since I don't want to do it, I won't do it, and therefore it cannot happen in reality. The concept of trying to do something you will never want to do has no place in reality either.

Nick_Tarleton: "Not to my benefit" is ambiguous; I assume you mean working against other goals, like happiness or other people not dying. But since optimizing for one thing means not optimizing for others, every goal has this property relative to every other (for an ideal agent). Still, the concept seems very useful; any thoughts on how to formalize it?

The Costs of Rationality

by RobinHanson · 1 min read · 3rd Mar 2009 · 81 comments

The word "rational" is overloaded with associations, so let me be clear: to me [here], more "rational" means better believing what is true, given one's limited info and analysis resources. 

Rationality certainly can have instrumental advantages.  There are plenty of situations where being more rational helps one achieve a wide range of goals.  In those situations, "winners", i.e., those who better achieve their goals, should tend to be more rational.  In such cases, we might even estimate someone's rationality by looking at his or her "residual" belief-mediated success, i.e., after explaining that success via other observable factors.
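
As a loose illustration of that last idea (a toy sketch, not anything from the post): "residual" success is just the part of someone's success left over after regressing it on the other observable factors. All the variable names and the simulated data below are hypothetical.

```python
# Toy sketch: estimate "rationality" as the residual of success after
# regressing out other observable factors. Data and names are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical observable factors that also predict success.
talent = rng.normal(size=n)
connections = rng.normal(size=n)
rationality = rng.normal(size=n)  # the unobserved trait we want to recover

# Assume success depends on all three, plus noise.
success = 1.0 * talent + 0.8 * connections + 0.5 * rationality \
    + rng.normal(scale=0.3, size=n)

# Regress success on the observable factors only (with an intercept).
X = np.column_stack([np.ones(n), talent, connections])
coef, *_ = np.linalg.lstsq(X, success, rcond=None)

# The residual is the part of success the observables don't explain;
# under these toy assumptions it tracks the unobserved rationality term.
residual = success - X @ coef
print("correlation(residual, rationality):",
      np.corrcoef(residual, rationality)[0, 1])
```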

But note: we humans were designed in many ways not to be rational, because believing the truth often got in the way of achieving goals evolution had for us.  So it is important for everyone who intends to seek truth to clearly understand: rationality has costs, not only in time and effort to achieve it, but also in conflicts with other common goals.

Yes, rationality might help you win that game or argument, get promoted, or win her heart.  Or more rationality for you might hinder those outcomes.  If what you really want is love, respect, beauty, inspiration, meaning, satisfaction, or success, as commonly understood, we just cannot assure you that rationality is your best approach toward those ends.  In fact we often know it is not.

The truth may well be messy, ugly, or dispiriting; knowing it may make you less popular, loved, or successful.  These are actually pretty likely outcomes in many identifiable situations.  You may think you want to know the truth no matter what, but how sure can you really be of that?  Maybe you just like the heroic image of someone who wants the truth no matter what; or maybe you only really want to know the truth if it is the bright shining glory you hope for.

Be warned; the truth just is what it is.  If just knowing the truth is not reward enough, perhaps you'd be better off not knowing.  Before you join us in this quixotic quest, ask yourself: do you really want to be generally rational, on all topics?  Or might you be better off limiting your rationality to the usual practical topics where rationality is respected and welcomed?
