Rationality is the art of thinking in ways that result in accurate beliefs and good decisions. It is the primary topic of LessWrong.
Rationality is not only about avoiding the vices of self-deception and obfuscation (failing to communicate clearly), but also about the virtues of curiosity, seeing the world more clearly than before, and achieving things previously out of your reach. The study of rationality on LessWrong includes a theoretical understanding of ideal cognitive algorithms, as well as building a practice that uses these idealized algorithms to inform heuristics, habits, and techniques for successfully reasoning and making decisions in the real world.
Topics covered in rationality include (but are not limited to): normative and theoretical explorations of ideal reasoning; the capabilities and limitations of our brain, mind, and psychology; applied advice such as introspection techniques and how to achieve truth collaboratively; and practical techniques and methodologies for figuring out what's true, ranging from rough quantitative modeling to full research guides.
A good heuristic is that rationality is about cognitive algorithms. Rather than being a synonym for true or optimal, the term rational should be reserved for describing whether or not a cognitive algorithm results in true beliefs and optimal actions.
This is distinct from practical advice, such as how to improve relationships or implement productivity systems, which should not be considered "rationality" per se. Some have pushed back against labeling self-help as "rational dating", etc., for reasons along these lines [1, 2], and they are probably correct.
In accordance with this, LessWrong classifies most self-help type advice under the World Optimization tag and not the Rationality tag.
Similarly, most object-level material about how the world is (math, biology, history, etc.) is tagged under the World Modeling tag, with exceptions such as neuroscience and probability theory, which have concrete consequences for how one ought to think.
Early material on LessWrong frequently describes rationality with reference to heuristics and biases [1, 2]. Indeed, LessWrong grew out of the blog Overcoming Bias, and even Rationality: A-Z opens with a discussion of biases, its first chapter titled Predictably Wrong. The idea is that the human mind has been shown to systematically make certain errors of reasoning, like confirmation bias. Rationality then consists of overcoming these biases.
Apart from the issue of the replication crisis, which discredited many examples of bias commonly referenced on LessWrong, e.g. priming, the "overcoming biases" frame of rationality is too limited. Rationality requires the development of many positive skills, not just the removal of negative biases to reveal underlying perfect reasoning. These include skills such as how to update the correct amount in response to evidence, how to resolve disagreements with others, how to introspect, and many more.
Classically, LessWrong has drawn a distinction between instrumental rationality and epistemic rationality. These terms can be misleading, however: it's not as though epistemic rationality can genuinely be traded off for gains in instrumental rationality. Such trades are only apparent, and believing one should make them is a trap.
Instrumental rationality is defined as being concerned with achieving goals. More specifically, instrumental rationality is the art of choosing and implementing actions that steer the future toward outcomes ranked higher in one's preferences. Said preferences are not limited to 'selfish' preferences or unshared values; they include anything one cares about.
Epistemic rationality is defined as the part of rationality which involves achieving accurate beliefs about the world. It involves updating on receiving new evidence, mitigating cognitive biases, and examining why you believe what you believe. It can be seen as a form of instrumental rationality in which knowledge and truth are goals in themselves, whereas in other forms of instrumental rationality, knowledge and truth are only potential aids to achieving goals. Someone practicing instrumental rationality might even find falsehood useful.
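The phrase "updating on receiving new evidence" has a precise formal core: Bayes' rule. As a minimal illustrative sketch (the function name and example numbers are my own, not from the text), here is a single Bayesian update in Python:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return P(H | E) from P(H), P(E | H), and P(E | not-H) via Bayes' rule."""
    # Total probability of seeing the evidence under either hypothesis.
    p_evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_evidence

# Hypothetical example: evidence that is 90% likely if H is true and
# 5% likely if H is false, applied to a hypothesis with a 10% prior.
posterior = bayes_update(prior=0.1, p_e_given_h=0.9, p_e_given_not_h=0.05)
print(round(posterior, 3))  # 0.667
```

The point of the formalism is the one made above: how much to update is not a matter of taste but follows from how much more likely the evidence is under one hypothesis than the other.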
In a field like biology, we can draw a distinction between the science of biology, which involves various theories and empirical data about biological life, and the art of being a biologist, which is the specific way that a biologist thinks, plays with ideas, and interacts with the world around them. Similarly, rationality is both a science and an art. There's the study of the iron-clad laws of reasoning and the mechanics of the human mind, but there's also the general training to be the kind of person who reasons well.
The term rationalist as a description of people is used in a couple of ways. It can refer to someone who endeavors to think better and implement as much rationality as they can. Many prefer the term aspiring rationalist to convey that the identifier is a claim to the goal of being more rational rather than a claim of having attained it already.
Perhaps more commonly, rationalist is used culturally to refer to someone associated with various rationalist communities, independent of their efforts to improve their rationality.