
    We mean:

    1. Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory.  The art of obtaining beliefs that correspond to reality as closely as possible.  This correspondence is commonly termed "truth" or "accuracy", and we're happy to call it that.
    2. Instrumental rationality: achieving your values.  Not necessarily "your values" in the sense of being selfish values or unshared values: "your values" means anything you care about.  The art of choosing actions that steer the future toward outcomes ranked higher in your preferences.  On LW we sometimes refer to this as "winning".

    If that seems like a perfectly good definition, you can stop reading here; otherwise continue.

    Sometimes experimental psychologists uncover human reasoning that seems very strange - for example, someone rates the probability "Bill plays jazz" as less than the probability "Bill is an accountant who plays jazz".  This seems like an odd judgment, since any particular jazz-playing accountant is obviously a jazz player.  But to what higher vantage point do we appeal in saying that the judgment is wrong?

    Experimental psychologists use two gold standards: probability theory, and decision theory.  Since it is a universal law of probability theory that P(A) ≥ P(A & B), the judgment P("Bill plays jazz") < P("Bill plays jazz" & "Bill is accountant") is labeled incorrect.
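    The inequality is easy to check mechanically. Below is a minimal sketch (not from the post) using a made-up joint distribution over the two propositions; any non-negative numbers summing to one would illustrate the same point:

    ```python
    # Toy joint distribution over (plays_jazz, is_accountant); the numbers are
    # hypothetical and only need to be non-negative and sum to 1.
    joint = {
        (True, True): 0.03,
        (True, False): 0.07,
        (False, True): 0.20,
        (False, False): 0.70,
    }

    # P(jazz) is the sum of P(jazz & accountant) and P(jazz & not-accountant),
    # so it can never be smaller than P(jazz & accountant).
    p_jazz = sum(p for (jazz, _), p in joint.items() if jazz)
    p_jazz_and_accountant = joint[(True, True)]

    assert p_jazz >= p_jazz_and_accountant
    print(round(p_jazz, 2), p_jazz_and_accountant)  # 0.1 0.03
    ```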

    To keep it technical, you would say that this probability judgment is non-Bayesian.  Beliefs that conform to a coherent probability distribution, and decisions that maximize the probabilistic expectation of a coherent utility function, are called "Bayesian".
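    For concreteness, here is a hedged toy sketch of what "Bayesian" means in this decision-theoretic sense: beliefs form a coherent probability distribution over states, and the chosen action maximizes expected utility under that distribution. The states, actions, and utilities below are invented purely for illustration:

    ```python
    # Hypothetical states, actions, and utilities, purely for illustration.
    beliefs = {"rain": 0.3, "no_rain": 0.7}   # a coherent distribution: sums to 1

    utility = {                               # utility of (action, state) pairs
        ("take_umbrella", "rain"): 5,
        ("take_umbrella", "no_rain"): 3,
        ("leave_umbrella", "rain"): -10,
        ("leave_umbrella", "no_rain"): 6,
    }

    def expected_utility(action):
        """Probability-weighted average utility of an action over the states."""
        return sum(beliefs[state] * utility[(action, state)] for state in beliefs)

    actions = ["take_umbrella", "leave_umbrella"]
    best = max(actions, key=expected_utility)
    print(best, [(a, round(expected_utility(a), 2)) for a in actions])
    # take_umbrella [('take_umbrella', 3.6), ('leave_umbrella', 1.2)]
    ```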

    This does not quite exhaust the problem of what is meant in practice by "rationality", for two major reasons:

    First, the Bayesian formalisms in their full form are computationally intractable on most real-world problems.  No one can actually calculate and obey the math, any more than you can predict the stock market by calculating the movements of quarks.

    This is why we have a whole site called "Less Wrong", rather than simply stating the formal axioms and being done.  There's a whole further art to finding the truth and accomplishing value from inside a human mind: we have to learn our own flaws, overcome our biases, prevent ourselves from self-deceiving, get ourselves into good emotional shape to confront the truth and do what needs doing, etcetera etcetera and so on.

    Second, sometimes the meaning of the math itself is called into question.  The exact rules of probability theory are called into question by e.g. anthropic problems in which the number of observers is uncertain.  The exact rules of decision theory are called into question by e.g. Newcomblike problems in which other agents may predict your decision before it happens.

    In cases like these, it is futile to try to settle the problem by coming up with some new definition of the word "rational", and saying, "Therefore my preferred answer, by definition, is what is meant by the word 'rational'."  This simply begs the question of why anyone should pay attention to your definition.  We aren't interested in probability theory because it is the holy word handed down from Laplace.  We're interested in Bayesian-style belief-updating (with Occam priors) because we expect that this style of thinking gets us systematically closer to, you know, accuracy, the map that reflects the territory.  (More on the futility of arguing "by definition" here and here.)
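    As a small illustration of the belief-updating referred to above (the numbers are invented, and this glosses over where the prior comes from), Bayes' rule shifts credence toward whichever hypothesis better predicted the observed evidence:

    ```python
    def bayes_update(prior_h, p_e_given_h, p_e_given_not_h):
        """Return P(H | E) from P(H), P(E | H), and P(E | not-H) via Bayes' rule."""
        joint_h = prior_h * p_e_given_h
        joint_not_h = (1.0 - prior_h) * p_e_given_not_h
        return joint_h / (joint_h + joint_not_h)

    # Hypothetical example: a 10% prior that a coin is biased toward heads
    # (P(heads) = 0.8 if biased, 0.5 if fair), updated after seeing three heads.
    credence_biased = 0.10
    for _ in range(3):
        credence_biased = bayes_update(credence_biased, 0.8, 0.5)
    print(credence_biased)  # roughly 0.31; the evidence has shifted the map
    ```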

    And then there are questions of "How to think" that seem not quite answered by either probability theory or decision theory - like the question of how to feel about the truth once we have it.  Here again, trying to define "rationality" a particular way doesn't support an answer, merely presume it.

    From the Twelve Virtues of Rationality:

    How can you improve your conception of rationality?  Not by saying to yourself, “It is my duty to be rational.”  By this you only enshrine your mistaken conception.  Perhaps your conception of rationality is that it is rational to believe the words of the Great Teacher, and the Great Teacher says, “The sky is green,” and you look up at the sky and see blue.  If you think:  “It may look like the sky is blue, but rationality is to believe the words of the Great Teacher,” you lose a chance to discover your mistake.

    Do not ask whether it is “the Way” to do this or that.  Ask whether the sky is blue or green.  If you speak overmuch of the Way you will not attain it.

    You may try to name the highest principle with names such as “the map that reflects the territory” or “experience of success and failure” or “Bayesian decision theory”.  But perhaps you describe incorrectly the nameless virtue.  How will you discover your mistake?  Not by comparing your description to itself, but by comparing it to that which you did not name.

    We are not here to argue the meaning of a word, not even if that word is "rationality".  The point of attaching sequences of letters to particular concepts is to let two people communicate - to help transport thoughts from one mind to another.  You cannot change reality, or prove the thought, by manipulating which meanings go with which words.

    So if you understand what concept we are generally getting at with this word "rationality", and with the sub-terms "epistemic rationality" and "instrumental rationality", we have communicated: we have accomplished everything there is to accomplish by talking about how to define "rationality".  What's left to discuss is not what meaning to attach to the syllables "ra-tio-na-li-ty"; what's left to discuss is what is a good way to think.

    With that said, you should be aware that many of us will regard as controversial - at the very least - any construal of "rationality" that makes it non-normative:

    For example, if you say, "The rational belief is X, but the true belief is Y" then you are probably using the word "rational" in a way that means something other than what most of us have in mind.  (E.g. some of us expect "rationality" to be consistent under reflection - "rationally" looking at the evidence, and "rationally" considering how your mind processes the evidence, shouldn't lead to two different conclusions.)  Similarly, if you find yourself saying "The rational thing to do is X, but the right thing to do is Y" then you are almost certainly using one of the words "rational" or "right" in a way that a huge chunk of readers won't agree with.

    In this case - or in any other case where controversy threatens - you should substitute more specific language:  "The self-benefiting thing to do is to run away, but I hope I would at least try to drag the girl off the railroad tracks" or "Causal decision theory as usually formulated says you should two-box on Newcomb's Problem, but I'd rather have a million dollars."
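    For readers unfamiliar with the Newcomb example, here is a hedged sketch of the expected-value arithmetic behind "I'd rather have a million dollars", assuming a predictor that is correct with probability p and the standard payoffs ($1,000 in the transparent box, $1,000,000 in the opaque box if one-boxing was predicted). Causal decision theory disputes the relevance of this calculation, which is exactly the controversy the paragraph gestures at:

    ```python
    # Standard Newcomb payoffs: $1,000 in the transparent box, $1,000,000 in the
    # opaque box iff the predictor predicted one-boxing.  'p' is the assumed
    # probability that the predictor is correct.
    def newcomb_expected_payoffs(p):
        one_box = p * 1_000_000
        two_box = p * 1_000 + (1 - p) * 1_001_000
        return one_box, two_box

    print(tuple(round(x) for x in newcomb_expected_payoffs(0.99)))  # (990000, 11000)
    ```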

    "X is rational!" is usually just a more strident way of saying "I think X is true" or "I think X is good".  So why have an additional word for "rational" as well as "true" and "good"?  Because we want to talk about systematic methods for obtaining truth and winning.

    The word "rational" has potential pitfalls, but there are plenty of non-borderline cases where "rational" works fine to communicate what one is getting at, likewise "irrational".  In these cases we're not afraid to use it.

    Yet one should also be careful not to overuse that word.  One receives no points merely for pronouncing it loudly.  If you speak overmuch of the Way you will not attain it.


    Note: this post originally appeared in a context without comments on Overcoming Bias. Old comments on this post are over here.

    How should we deal with cases where epistemic rationality contradicts instrumental rationality? For example, we may want to make use of the placebo effect, because one of our values is that healthy is better than sick and less pain is better than more pain. But the placebo effect depends on believing the pill is working medicine, which is false. Is there any way to satisfy both epistemic and instrumental rationality?

    It depends on the case, I would think.  There are instances where you are probably better off trading epistemic rationality for instrumental rationality, but where the situation is too chaotic to estimate well and the tradeoff looks close to even, I would personally err on the side of epistemic rationality.  Brains are complicated: forcing a placebo effect might have ripple effects across your psyche, such as an increased tendency to shut down the voice in your head that speaks up when you know, on some level, that your belief is wrong (a very speculative example), all for limited short-term gain.

    Thank you, wonderful series!

    It seems to me that this is not a contradiction between the two rationalities. Rather, it is similar to the resonance of doubt: if a placebo works when you believe in it, then believing in it makes the belief true. What you would need is a reverse example, where believing something is true makes it false. (Believing that something is safe won't work as such an example, since you only need to avoid acting more carelessly on the belief that it is safe, which is just a matter of instrumental rationality.)


    If you believe that the placebo works, it works. You're right in believing it works.
    If you don't believe that the placebo works, it doesn't work. You're right believing it doesn't work

    If you believe that the sky is blue, you're right.
    If you believe that the sky is green, it's still blue, you're wrong.

    Truths that involve humans have some amount of reflexivity. 

    I'd say you shouldn't force yourself to believe something (epistemic rationality) to achieve a goal (instrumental rationality). This is because, in my view, human minds are addicted to feeling consistent, so it'd be very difficult (i.e., resource expensive) to believe a drug works when you know it doesn't.

    What does it even mean to believe something is true when you know it's false? I don't know. Whatever it means, it would have to be a psychological matter rather than an epistemological one. My personal recommendation is to believe only things that are true. The modern environment we live in generally rewards rational behavior based on accurate knowledge anyway, so the problem rarely needs to surface.

    The essay reminds me of the book Language in Thought and Action by Samuel Hayakawa. The author also used the map and territory metaphor in that book.

    Eliezer has elsewhere mentioned it as having been an influence in his youth. The saying "the map is not the territory" originated with Korzybski, and Hayakawa's book is a popularisation of his work.

    Thank you for the reference. I just stumbled onto this website and found the essays very interesting. As a Chinese reader, I don't find much content like this on the Chinese web. I'm really lucky to get to enjoy the ideas while improving my English.

    Welcome! There's a monthly open thread where newcomers are invited to introduce themselves.

    you should substitute more specific language in place of “rational”: “The self-benefiting thing to do is to run away, but I hope I would at least try to drag the child off the railroad tracks,”

    Wouldn't it be correct to say that it would be 'instrumentally rational' to run away in this case? It sounds rational to me, insofar as 'winning' means 'surviving'.

    Is the last sentence rational?

    The one that says "If you speak overmuch of the Way, you will not attain it."

    This is a reference to Taoism (the tao = the Way). I believe it is a different approach to the tenet I've heard expressed as "The Tao that can be explained is not the true Tao". I believe the reference is meant to remind us that the point here is to end up performing less wrong rational thinking, not just talking about it.

    great post, just wanted to point out a typo here: "I cant quote the equations of General Relativity from memory, but nonetheless if I walk off a cliff, Ill fall. "

    it should be "I'll fall". good work otherwise.

    (Fixed, thank you!)

    Nice discussion. Thanks for putting this together. I learned something about Epistemic rationality vs Instrumental rationality.

    The bit about the sky being blue or green seems to raise the question of a justification for objective truth, as championed by St Augustine and Leibniz, as opposed to arguments for subjective reality, as championed by the Cynics and Skeptics and, more recently, the Frankfurt School. One could make the case that the sky appears green to one person but blue to another.

    This topic comes up in many places throughout the history of thought. I'm actually working on a post for my blog exploring that at www.SimplyUrban.Org.