LessWrong Canon On Rationality

  • Absolute certainty - the equivalent of a Bayesian probability of 1. Losing an epistemic bet made with absolute certainty corresponds to receiving an infinite negative payoff, according to the logarithmic proper scoring rule. Cromwell's rule states that prior probabilities of 0 or 1 should be avoided, except when applied to statements that are logically true or false.
  • Adaptation executors - Individual organisms are best thought of as adaptation-executers rather than as fitness-maximizers. Our taste buds do not start finding lettuce delicious and cheeseburgers distasteful just because our diet has become too high in calories and too low in micronutrients. Tastebuds are adapted to an ancestral environment in which calories, not micronutrients, were the limiting factor. Evolution operates on too slow a timescale to re-adapt to new conditions (such as a modern diet).
  • Adversarial process - a form of truth-seeking or conflict resolution in which identifiable factions hold one-sided positions.
  • Altruism - Actions undertaken for the benefit of other people. If you do something to feel good about helping people, or even to be a better person in some spiritual sense, it isn't truly altruism.
  • Amount of evidence - to a Bayesian, evidence is a quantitative concept. The more complicated or a priori improbable a hypothesis is, the more evidence you need just to justify it, or even just to single it out from amongst the mass of competing theories.
  • Anti-epistemology - bad explicit beliefs about rules of reasoning, usually developed in the course of protecting an existing false belief. False beliefs are opposed not only by true beliefs (which must then be obscured in turn) but also by good rules of systematic reasoning (which must then be denied). The explicit defense of fallacy as a general rule of reasoning is anti-epistemology.
  • Antiprediction - is a statement of confidence in an event that sounds startling, but actually isn't far from a maxentropy prior. For example, if someone thinks that our state of knowledge implies strong ignorance about the speed of some process X on a logarithmic scale from nanoseconds to centuries, they may make the startling-sounding statement that X is very unlikely to take 'one to three years'.
  • Applause light - is an empty statement which evokes positive affect without providing new information.
  • Artificial general intelligence - is a machine capable of behaving intelligently over many domains.
  • Bayesian - Bayesian probability theory is the math of epistemic rationality, Bayesian decision theory is the math of instrumental rationality.
  • Aumann's agreement theorem - roughly speaking, says that two agents acting rationally (in a certain precise sense) and with common knowledge of each other's beliefs cannot agree to disagree. More specifically, if two people are genuine Bayesians, share common priors, and have common knowledge of each other's current probability assignments, then they must have equal probability assignments.
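The infinite penalty in the "Absolute certainty" entry above falls straight out of the logarithmic proper scoring rule. A minimal sketch (the function name and the 0.99 example are illustrative, not from any post):

```python
import math

def log_score(p_outcome: float) -> float:
    """Logarithmic proper scoring rule: your reward is the log of the
    probability you assigned to the outcome that actually happened."""
    if p_outcome == 0.0:
        return float("-inf")  # the infinite negative payoff described above
    return math.log(p_outcome)

# Assigning 0.99 and being wrong costs a large but finite penalty:
print(log_score(1 - 0.99))  # log(0.01), about -4.6
# Assigning probability 1 and being wrong costs everything:
print(log_score(1 - 1.0))   # -inf
```

This is why Cromwell's rule advises against priors of 0 or 1 for empirical claims: no finite stream of good predictions can ever repay a single lost bet made at certainty.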
...
Created by ScottL at 4y
  • Affective death spiral - positive attributes of a theory, person, or organization combine with the Halo effect in a feedback loop, resulting in the subject of the affective death spiral being held in higher and higher regard.
  • Anthropomorphism - the error of attributing distinctly human characteristics to nonhuman processes.
  • Bystander effect - a social psychological phenomenon in which individuals are less likely to offer help in an emergency situation when other people are present.
  • Connotation - the emotional association with a word. Be careful that you are not conveying a different connotation than you mean to.
  • Correspondence bias (also known as the fundamental attribution error) - is the tendency to overestimate the contribution of lasting traits and dispositions in determining people's behavior, as compared to situational effects.
  • Death Spirals and the Cult Attractor - Cultishness is an empirical attractor in human groups: roughly an affective death spiral, plus peer pressure and outcasting behavior, plus (quite often) defensiveness around something believed to have been perfected.
  • Detached lever fallacy - the assumption that something simple for one system will be simple for others. This assumption neglects the fact that something may only be simple because of complicated underlying machinery which is triggered by a simple action, like pulling a lever. Adding this lever to something else won't allow the action to occur, because the underlying complicated machinery is not there.
  • Giant cheesecake fallacy - occurs when an argument leaps directly from capability to actuality, without considering the necessary intermediate of motive. An example of the fallacy might be: a sufficiently powerful Artificial Intelligence could overwhelm any human resistance and wipe out humanity. (Belief without evidence: the AI would decide to do so.) Therefore we should not build AI.
  • Halo effect - a specific type of confirmation bias, wherein positive feelings in one area cause ambiguous or neutral traits to be viewed positively.
  • Illusion of transparency - misleading impression that your words convey more to others than they really do.
  • Inferential distance - a gap between the background knowledge and epistemology of a person trying to explain an idea, and the background knowledge and epistemology of the person trying to understand it.
  • Information cascade - occurs when people signal that they have information about something, but actually based their judgment on other people's signals, resulting in a self-reinforcing community opinion that does not necessarily reflect reality.
  • Mind projection fallacy - occurs when someone thinks that the way they see the world reflects the way the world really is, going as far as assuming the real existence of imagined objects.
  • Other-optimizing - a failure mode in which a person vastly overestimates their ability to optimize someone else's life, usually as a result of underestimating the differences between themselves and others, for example through the typical mind fallacy.
  • Peak-end
...
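The "Information cascade" entry above describes a mechanism that is easy to simulate. In this toy model (all names, the 0.7 accuracy, and the counting rule are illustrative assumptions, not from the original post), each agent naively treats every earlier public choice as one extra signal; once the public tally outweighs any single private signal, everyone copies the crowd regardless of what they privately observe:

```python
import random

def run_cascade(n_agents: int, signal_accuracy: float = 0.7,
                true_state: int = 1, seed: int = 0) -> list:
    """Each agent privately sees a noisy signal of true_state but publicly
    sees only earlier agents' choices, which it counts as extra signals."""
    rng = random.Random(seed)
    choices = []
    for _ in range(n_agents):
        private = true_state if rng.random() < signal_accuracy else 1 - true_state
        tally = sum(1 if c == 1 else -1 for c in choices)  # public evidence
        tally += 1 if private == 1 else -1                 # plus own signal
        if tally > 0:
            choices.append(1)
        elif tally < 0:
            choices.append(0)
        else:
            choices.append(private)  # tie: follow your own signal
        # Once the public tally alone reaches +-2, no private signal can
        # flip the sign: an information cascade has started.
    return choices

print(run_cascade(20))
```

Note that the cascade can lock in on the wrong answer: the self-reinforcing community opinion need not reflect the true state, which is the point of the entry above.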
  1. Curiosity - the burning itch
  2. Relinquishment - "That which can be destroyed by the truth should be." - P. C. Hodgell
  3. Lightness - follow the evidence wherever it leads
  4. Evenness - resist selective skepticism; use reason, not rationalization
  5. Argument - do not avoid arguing; strive for exact honesty; fairness does not mean balancing yourself evenly between propositions
  6. Empiricism - knowledge is rooted in empiricism and its fruit is prediction; argue what experiences to anticipate, not which beliefs to profess
  7. Simplicity - is virtuous in belief, design, planning, and justification; ideally: nothing left to take away, not nothing left to add
  8. Humility - take actions, anticipate errors; do not boast of modesty; no one achieves perfection
  9. Perfectionism - seek the answer that is perfectly right - do not settle for less
  10. Precision - the narrowest statements slice deepest; don't walk but dance to the truth
  11. Scholarship - absorb the powers of science
  12. The nameless virtue (The void) - More than anything, you must think of carrying your map through to reflecting the territory.
  • Weirdness points (also known as idiosyncrasy credits) - Each extra weird belief you have detracts from your ability to spread other, perhaps more important, weird memes. Therefore normal beliefs should be preferred to some extent, even when you expect them to be less correct or less locally useful on an issue, in order to improve your overall effectiveness at spreading your most highly valued memes.
  • Akrasia - the state of acting against one's better judgment. Note that, for example, if you are procrastinating because it's not in your best interest to complete the task you are delaying, it is not a case of akrasia.
  • Alief - an independent source of emotional reaction which can coexist with a contradictory belief. For example, the fear felt when a monster jumps out of the darkness in a scary movie is based on the alief that the monster is about to attack you, even though you believe that it cannot.
  • Anti-inductiveness (also known as the reverse Tinkerbell effect) - the idea that the market would stop being efficient if everyone acted as though it already were efficient. Another example is voting in a democracy: the more people believe their vote counts towards the outcome of an election, the less each vote counts, since there is a greater population of voters and each individual vote is a smaller percentage of the total.
  • Effort Shock - the unpleasant discovery of how hard it is to accomplish something.
  • Absurdity heuristic - is a mental shortcut where highly untypical situations are classified as absurd or impossible. Where you don't expect intuition to construct an adequate model of reality, classifying an idea as impossible may be overconfident.
  • Affect heuristic - a mental shortcut that makes use of current emotions to make decisions and solve problems quickly and efficiently.
  • Arguing by analogy - is arguing that since things are alike in some ways, they will probably be alike in others. While careful application of argument by analogy can be a powerful tool, there are limits to the method after which it breaks down.
  • Arguing by definition - arguing that something is part of a class because it fits the definition of that class. It is recommended to avoid this wherever possible and instead treat words as labels that cannot capture the rich cognitive content that actually constitutes their meaning. As Feynman said: "You can know the name of a bird in all the languages of the world, but when you're finished, you'll know absolutely nothing whatever about the bird... So let's look at the bird and see what it's doing -- that's what counts." It is better to keep the focus on the facts of the matter and try to understand what your interlocutor is trying to communicate than to get lost in a pointless discussion of definitions that bears no fruit.
  • Arguments as soldiers - is a problematic scenario where arguments are treated like war or battle. Arguments get treated as soldiers, weapons to be used to defend your side of the debate, and to attack the other side. They are no longer instruments of the truth.
  • Availability heuristic - a mental shortcut that treats easily recalled information as important or at least more important than alternative solutions which are not as readily recalled
  • Belief as cheering - People can bind themselves as a group by believing "crazy" things together. They can then show the same pride in their crazy belief among outsiders as they would show wearing "crazy" group clothes. The belief is more like a banner saying "GO BLUES". It isn't a statement of fact, or an attempt to persuade; it doesn't have to be convincing; it's a cheer.
  • Beware of Deepities - A deepity is a proposition that seems both important and true (and profound) but that achieves this effect by being ambiguous. An example is "love is a word". One interpretation is that "love", the word, is a word; this is trivially true. The second interpretation is that love is nothing more than a verbal construct. This interpretation is false, but if it were true would be profound. The "deepity" seems profound due to a conflation of the two interpretations. People see the trivial
...
  • Bayesian decision theory - is a decision theory which is informed by Bayesian probability. It is a statistical system that tries to quantify
...
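The "Bayesian decision theory" entry above is about choosing the action that maximizes expected utility under a posterior distribution over states. A minimal sketch (the umbrella scenario and all numbers are made up for illustration):

```python
def best_action(posterior: dict, utility: dict) -> str:
    """Return the action with highest expected utility.

    posterior: {state: probability}; utility: {(action, state): payoff}.
    """
    actions = {a for (a, _s) in utility}
    def expected_utility(a):
        return sum(p * utility[(a, s)] for s, p in posterior.items())
    return max(actions, key=expected_utility)

# Carry an umbrella? Posterior after seeing dark clouds:
posterior = {"rain": 0.6, "dry": 0.4}
utility = {("umbrella", "rain"): 0,   ("umbrella", "dry"): -1,
           ("leave_it", "rain"): -10, ("leave_it", "dry"): 0}
print(best_action(posterior, utility))  # -> umbrella
```

Here expected utility is -0.4 for carrying the umbrella versus -6 for leaving it, so the umbrella wins; with a posterior of only 1% rain, the same function picks "leave_it". The epistemic step (getting the posterior right) and the instrumental step (maximizing over actions) are exactly the two halves named in the "Bayesian" entry.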
