I would like to suggest that we try to come up with several definitions of rationality. I don't feel we have exhausted this search area by any means. Robin has suggested, "More 'rational' means better believing what is true, given one's limited info and analysis resources." Other commenters have emphasised goal-directed behaviour as a necessary ingredient of rationality. I think these definitions miss several important ingredients - such as the social nature of rationality. There is also a subtext which argues that rationality gives only one (correct) answer, even if we can only approximate it. I feel strongly that rationality can give several correct answers, and thus imagination is an ingredient of rationality. So, without in any way believing that I have found the one correct definition, I propose the following: when two or more brains try to be sensible about things and expand their agency. I believe that "sensible" in this context does not need to be defined, as it is a primitive and each player will submit their own meaning.

Maybe this is a can of worms - but are there other suggestions or definitions of rationality we can apply in our lives?


I'd rephrase this to emphasize the nonboolean nature of belief: an aspiring rationalist should seek to make his degree of belief in a proposition correspond to the strength of the evidence. It is also an error to have excessive confidence in a proposition that is most likely true.
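A minimal sketch of that correspondence in Python, assuming Bayes' rule as the update procedure; the prior, likelihoods, and proposition are invented for illustration:

```python
# Degree of belief as a probability in [0, 1], updated by evidence,
# rather than a boolean "believe / don't believe".

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) via Bayes' rule, given P(H) and the likelihoods of E."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

belief = 0.5                       # start agnostic about proposition H
belief = update(belief, 0.8, 0.3)  # evidence favouring H moves belief up
belief = update(belief, 0.6, 0.5)  # weaker evidence moves it up less
print(f"degree of belief in H: {belief:.2f}")  # ~0.76, not 1.0
```

Strong evidence moves the number a lot, weak evidence only a little, and confidence never snaps all the way to 0 or 1.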

I guess you are implying that using fuzzy logic can be considered rational.
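For what it's worth, a toy illustration of that idea using the standard Zadeh min/max operators (the membership degrees here are invented):

```python
# Fuzzy logic: truth values are degrees in [0, 1], not just {0, 1}.
tall = 0.7    # "this person is tall" holds to degree 0.7
strong = 0.4  # "this person is strong" holds to degree 0.4

tall_and_strong = min(tall, strong)  # Zadeh AND -> 0.4
tall_or_strong = max(tall, strong)   # Zadeh OR  -> 0.7
not_tall = 1.0 - tall                # negation  -> 0.3
```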

  1. I think it also means complying with the rules of inference.
  2. In a colloquial sense, it might mean that a certain process has the expected result.

But I think it is irrational to try to be completely rational, given that there is still a lot we do not know and that obscure/weird things therefore appear/happen.

The only problem I have with Robin's definition ("more 'rational' means better believing what is true, given one's limited info and analysis resources") is that it doesn't make a point of distinguishing between irrationality and other forms of stupidity.

I wouldn't call someone irrational if they dropped a sign in a calculation, or were simply not intelligent enough to understand how to calculate the answer, but if someone correctly calculates the optimal trajectory, then takes a different route because "faith" tells him to, I would call that irrational.

My concept of rationality fits better with the idea of skillfully choosing which "Rituals Of Cognition" to trust. To put it another way, someone is rational to the extent that their preferred rituals of cognition "win" at the game of believing what is true (even if they manage to fail at implementing their "ROC" - that just makes them stupid/error-prone).

The "given one's limited analysis resources" clause seems to cover some of this, but only vaguely, and would seem to give someone "rationality points" for coming up with a better algorithm that requires less clock cycles, while I would just give them "cleverness points". If one counts "non rationality intelligence" as a limited resource, then Robin's definition seems to agree, but "intelligence" is not very well defined either so defining "rationality" in terms of "intelligence" won't help us nail it down concretely.

Does anyone else have any thoughts on the difference between "irrational" and "other stupidity"?

I disagree... I think "limited analysis resources" accounts for the very difference you speak of. I think the "rituals of cognition" you mention are themselves subject to rationality analysis: if I'm understanding you correctly, you are talking about someone who knows how to be rational in theory but cannot implement that theory in practice. I think you run into three possibilities there.

One, the person has insufficient analytical resources to translate their theory into action, which Robin accounts for. The person is still rational, given their budget constraint.

Two, the person could gain the ability to make the proper translation, but the costs of doing so are so high that the person is better off with the occasional translation error. The person rationally chooses not to learn better translation techniques.

Three, the person systematically makes mistakes in the translations. That, I think, we can fairly call a bias, which is what we're trying to avoid here. The person is acting irrationally - if there is a predictable bias, it should have been corrected for.

On your last point: "[Robin would] give someone "rationality points" for coming up with a better algorithm that requires less clock cycles, while I would just give them "cleverness points"." I think I have to side with Robin here. On certain issues it might not matter how quickly or efficiently the rational result is arrived at, but I think in almost all situations coming up with a faster way to arrive at a rational result is more rational, since individuals face constraints of time and resources. While the faster algorithm isn't more rational on a single, isolated issue [assuming they both lead to the same rational result], the person would be able to move on to a different issue faster and thus have more resources available to be more rational in a different setting.


Let me propose a dichotomy between two kinds of possible definitions of rationality (a restatement of Cameron Taylor's idea):

1) A positive quality of a step in the decision making process.

2) The efficiency of a decision-making black box: its expected utility.

I think that accurate definitions of the first type are difficult to give and in scientific practice difficult to use, for two reasons:

-Because they presuppose a model of the thinking process of the agent.

-Because they (usually?) presuppose a way to judge the quality of intermediate states of the decision-making process of the agent.

Evolutionary biologists, for example, use the concept of fitness, which clearly has a definition of the second kind.

While trying to define rationality, it is very tempting to describe how to be rational. Since being rational is so difficult, resisting this temptation could be beneficial. Is there a definition of the first kind which doesn't say too much about how to be rational?


###Does rationality only give one answer?

There is also a subtext which argues that rationality gives only one (correct) answer, even if we can only approximate it.

The map is not the territory. The territory is the breathtaking mishmash of probabilities and quantum states and who knows what that is the universe we (kind of sorta) know and love. The map is the combined representation of all the evidence we have about the universe, which we use to approximate our environment and navigate our world.

Each agent will have a different map of the universe and so may give a different answer to any particular question.

###Is it 'rational' to be right, or is it rational to use your available evidence efficiently?

I would describe as "most rational" the agent who gives the answer that matches most closely the evidence that is available to him, as opposed to the agent that gives the best answer. I don't have a word for what it is to have a better answer to a question despite the fact that you royally f*ed up the thinking and really should have been even further ahead in accuracy than you were.

I don't have a word for what it is to have a better answer to a question despite the fact that you royally f*ed up the thinking and really should have been even further ahead in accuracy than you were.

(Epistemically) lucky.

Arguing over definitions is pointless, and somewhat dangerous. If we define the word "rational" in some sort of site-specific way, we risk confusing outsiders who come here and who haven't read the prior threads.

Use the word "rational" or "rationality" whenever the difference between its possible senses does not matter. When the difference matters, just use more specific terminology.

General rule: When terms are confusing, it is better to use different terms than to have fights over meanings. Indeed, your impulse to fight for the word-you-want should be deeply suspect; wanting to affiliate our ideas with pleasant-sounding words is very similar to our desire to affiliate with high-status others; it makes us (or our ideas) appealing for reasons that are unrelated to the correctness or usefulness of what we are saying.

Arguing over definitions is pointless if we're trying to name ideas. Arguing over definitions is absolutely necessary if there's disagreement over how to understand the stated positions of a third party. Establishing clear definitions is extremely important.

If someone has committed themselves to rationality, it's natural for us to ask "what do they mean by 'rationality'?" They should already have a clear and ready definition, which once provided, we can use to understand their commitment.

Sure, it is useful to ask for clarification when we don't understand what someone is saying. But we don't need to settle on one "correct" meaning of the term in order to accomplish this. We can just recognize that the word is used to refer to a combination of characteristics that cognitive activity might possess. I.e. "rationality" usually refers to thinking that is correct, clear, justified by available evidence, free of logical errors, non-circular, and goal-promoting. Sometimes this general sense may not be specific enough, particularly where different aspects of rationality conflict with each other. But then we should use other words, not seek to make rationality into a different concept.

"But we don't need to settle on one "correct" meaning of the term in order to accomplish this. "

We do in order to understand what we're saying, and for others to understand us. Switching back and forth between different meanings can not only confuse other people but also confuse ourselves.

To reach truly justified conclusions, our reasoning must be logically equivalent to syllogisms, with all of the precision and none of the ambiguity that implies.

I think we're missing a fairly basic definition of rationality, one that I think most people would intuitively come to. It involves the question of at what stage evidence enters the decision-making calculus.

Rationality is a process: it involves making decisions after weighing all available evidence and calculating the ideal response. Relevant information is processed consciously [though see Clarification below] before a decision is rendered.

This approach is opposed to a different, less conscious process: our instinctive and emotional responses to situations. In these situations, actual evidence doesn't enter the conscious decision-making process; instead, our brains, having evolved over time to respond in certain ways to certain stimuli, automatically react in certain pre-programmed ways. Those ways aren't random, of course, but adaptations to the ancestral environment. The key is that evidence specific to the situation isn't actually weighed and measured: the response is based on the brain's evolved automatic reaction.

Clarification: A process that is nearly automatic is still a rational process if it is the result of repeated training, rather than innate. For example, those who drive manual transmission cars will tell you that after a short while, you don't think about shifting: you just do. It becomes "second nature." This is still a conscious process: over time, you become trained to interpret information more efficiently and react quickly. This differs from the innate emotional and instinctive responses: we are instinctively attracted to beautiful people, for example, without having to learn it over and over again - it's "first nature." Though the responses are similar in appearance, I think most people would say that the former is rational, the latter is not.

Maybe this is a can of worms

Yes, but hopefully not if we're careful. A quick re-read of 37 Ways That Words Can Be Wrong might be useful for avoiding a few of them.


#Fundamental Definitions of Rationality

###Rationality(Thought)

  • The art of being Right.
  • To be rational is to form accurate beliefs from the available evidence and produce well calibrated predictions.
  • The rational agent is supplied with information about a territory and has the role of building the best map possible.
  • Can be considered 'passive' inasmuch as the 'rational' part here is not about making decisions or taking actions. However, it does involve active processing of the data, correcting for biases and perhaps actively seeking new evidence as appropriate.

###Rationality(Decision Making)

  • The art of Winning.
  • To be rational is to make the decisions that maximise expected utility (see the sketch after this list).
  • The rational agent has information available and must use it to determine at each instant which action to take.
  • The more 'active' definition of rationality.
  • Forming accurate beliefs can be expected to play an instrumental role in making rational choices but is not the priority.
  • Believing 'true' things may come with a cost. If the agent knows it is biased, it may decide not to correct certain biases because the resources are best spent in another area.
  • Believing 'true' things may actually lead to a lower utility than believing certain false things. This is the case if believing the truth leads to lower self confidence, a drain on emotional wellbeing or is a liability when signalling to others.
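A minimal sketch of the expected-utility bullet above in Python; the actions, outcome probabilities, and utilities are invented for illustration:

```python
# Pick the action whose probability-weighted utility is highest.

actions = {
    # action: [(probability of outcome, utility of outcome), ...]
    "take umbrella": [(0.3, 8.0), (0.7, 9.0)],    # rain / no rain
    "leave umbrella": [(0.3, 1.0), (0.7, 10.0)],  # rain / no rain
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(actions, key=lambda a: expected_utility(actions[a]))
print(best)  # "take umbrella": EU of 8.7 beats 7.3
```

Note that the 'winning' action here is judged purely by the expected utility of outcomes, not by how admirably the agent's beliefs were formed, which is exactly the contrast with Rationality(Thought) above.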