LessWrong Jargon

This is a short list of common terms and phrases, i.e., jargon, used on LessWrong.


A

ADBOC: Agree Denotationally, But Object Connotatively. Discussed in When Truth Isn't Enough.

B

Black swan: In the usage of Nassim Nicholas Taleb, a black swan is a rare event whose magnitude is so high as to impact the average of a series. These are characteristic of 'fat-tailed' distributions, as opposed to thin-tailed distributions such as the normal distribution, in which rare events are too unlikely to have a large impact.
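The difference shows up in a quick simulation. The sketch below (an illustration added here, not part of the original entry) compares how much the single largest observation contributes to the total in a thin-tailed sample versus a fat-tailed one:

```python
import random

random.seed(0)
N = 10_000

# Thin-tailed: absolute values of standard normal draws.
# No single draw can dominate the sum.
thin = [abs(random.gauss(0, 1)) for _ in range(N)]

# Fat-tailed: Pareto draws with shape alpha = 1.1 (a heavy tail).
# A handful of extreme draws can move the sample mean substantially.
fat = [random.paretovariate(1.1) for _ in range(N)]

# Share of the total contributed by the single largest observation.
thin_share = max(thin) / sum(thin)
fat_share = max(fat) / sum(fat)

print(f"thin-tailed: largest draw is {thin_share:.4%} of the total")
print(f"fat-tailed:  largest draw is {fat_share:.4%} of the total")
```

In the thin-tailed sample the largest draw is a negligible fraction of the total; in the fat-tailed sample one "black swan" draw carries a visible share of it.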

C

CEV: Coherent Extrapolated Volition. "In poetic terms, our coherent extrapolated volition is our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted."

Clever arguer: Someone skilled at writing convincing-sounding arguments for an existing belief. Inventing clever arguments for a belief does not change the truth value of the belief. Discussed in "The Bottom Line".

D

Deontology / deontological ethics (from Greek deon, "obligation, duty"): An approach to ethics that judges the morality of an action based on the action's adherence to a rule or rules. See the Wikipedia article on deontological ethics for more. Contrast consequentialism.

E

EEA: Environment of Evolutionary Adaptedness. An evolutionary psychology term synonymous with the more commonly used "ancestral environment". For humans, it refers to the state of tribal bands of hunter-gatherers.

Egan's law: "It all adds up to normality." Surprising truths do not make the sky orange and grey; it stays blue.

ETA: Edited To Add (though some would rather you say "Edit:" instead).

F

Fuzzies: The desired but less useful counterpart to utils. They make you feel you're altruistic and socially contributing.

H

I

I don't know: Something that can't be entirely true if you can even formulate a question.

Inferential distance: The number of inferences, or intermediate steps, it takes someone to get from their existing knowledge to an understanding of the point you're making. See also illusion of transparency.

K

L

M

MWI: Many-Worlds Interpretation, an interpretation of quantum mechanics advocated in Eliezer Yudkowsky's quantum mechanics sequence.

N

Noncentral fallacy: A rhetorical move often used in political, philosophical, and cultural arguments. "X is in a category whose archetypal member gives us a certain emotional reaction. Therefore, we should apply that emotional reaction to X, even though it is not a central category member."

O

One-box: One of the choices for Newcomb's problem.

Omega: A hypothetical superintelligent being, canonically found in Newcomb's problem.

P

Password: The answer you guess instead of actually understanding the problem. See Guessing the teacher's password.

PD: Prisoner's dilemma.

Privileging the hypothesis: The fallacy of singling out a specific hypothesis for investigation when there isn't enough evidence at hand to select this hypothesis over others. E.g., "We have no idea who committed the murder, so let's consider the possibility that Mortimer Q. Snodgrass did it, and investigate him."

Or "The origin of the universe sure is mysterious! Have you considered that it could have been done by the God of the Bible?"

Q

R

Reversed stupidity is not intelligence: "The world's greatest fool may say the Sun is shining, but that doesn't make it dark out."

Reality is normal: See also Egan's law.

Semantic stopsign: A term that looks like an explanation but, on closer examination, doesn't actually explain anything. Also called a curiosity stopper.

S

Solomonoff induction: A formalized version of Occam's razor based on Kolmogorov complexity.

Steel man: A term for the opposite of a Straw Man: the strongest possible form of an opponent's argument, even if they didn't make it themselves.

T

tl;dr: Too long; didn't read. Polite use: a one-line summary at the top of your long article. Impolite use: a dismissive response to another's long piece of writing or unparagraphed slab of text.

Two-box: One of the choices for Newcomb's problem.

Tsuyoku naritai: Japanese for "I want to become stronger."

U

Utility function: A utility function assigns numerical values ("utilities") to outcomes, in such a way that outcomes with higher utilities are always preferred to outcomes with lower utilities.

Utils: Units of utility; sometimes called "utilons". Contrast fuzzies.
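As a toy illustration (the outcome names and numbers below are invented for this sketch, not taken from the original text), a utility function can be represented as a mapping from outcomes to numbers, with preference and expected-utility comparisons falling out of ordinary arithmetic:

```python
# Hypothetical utilities over weather outcomes (illustrative values only).
utility = {"sun": 5.0, "rain_with_umbrella": -2.0, "rain_no_umbrella": -10.0}

def prefers(a: str, b: str) -> bool:
    """An agent prefers outcome a to outcome b iff a has higher utility."""
    return utility[a] > utility[b]

def expected_utility(lottery: dict) -> float:
    """Expected utility of a lottery: the probability-weighted average
    of the utilities of its possible outcomes."""
    return sum(p * utility[outcome] for outcome, p in lottery.items())

# With a 30% chance of rain, carrying the umbrella turns the rainy
# outcome from -10 into -2, raising the expected utility of the day.
take = expected_utility({"sun": 0.7, "rain_with_umbrella": 0.3})   # 2.9
leave = expected_utility({"sun": 0.7, "rain_no_umbrella": 0.3})    # 0.5
```

Note that only the ordering of utilities matters for plain preference; the numeric magnitudes start doing real work once probabilities enter via expected utility.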

W

Y

AFAICT: As Far As I Can Tell

Affect: Mood or emotion as demonstrated in external physical signs.

Affective death spiral: When positive attributions combine with the halo effect in a positive feedback loop.

AGI: Artificial general intelligence

Anti-epistemology: Bad rules for thinking itself, capable of protecting false beliefs.

Bayesian:

1. A theory of probability based on updating subjective estimates in the light of new evidence. Contrasted with the more objective frequentist approach, which views probability as being the mean of an infinite series of the same experiment.
2. Probabilistic reasoning in general.
3. Good, well done reasoning in general.

Bayesian Conspiracy: A fictional secret society of Bayesians.

Belief update: What you do to your beliefs, opinions and cognitive structure when new evidence comes along.
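In the first sense above, the update rule is just Bayes' theorem. A minimal sketch (the medical-test numbers are invented for illustration):

```python
def bayes_update(prior: float, p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Posterior P(H | E) via Bayes' theorem:
    P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# A disease with a 1% base rate; a test that flags 90% of the sick
# and 9% of the healthy. A positive result is weaker than it looks:
posterior = bayes_update(prior=0.01, p_evidence_if_true=0.90,
                         p_evidence_if_false=0.09)
print(f"P(sick | positive) = {posterior:.3f}")  # about 0.092, not 0.90
```

The low prior dominates: even a fairly accurate test leaves the posterior under 10%, which is the standard illustration of why updating must start from the prior rather than from the test accuracy alone.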

Blues and Greens: Roman Empire chariot-racing teams that became part of politics. Used in place of real party names. See Mind-killer.

Consequentialism: A moral theory that places value on the consequences of actions. Covered in more depth here.

Crisis of faith: What to have when you may have been quite wrong for a long time.

Dark arts: Rhetorical techniques crafted to exploit human cognitive biases. Considered bad behaviour even if the belief you want to communicate is good.

EY: Eliezer Yudkowsky

FAI: Friendly AI

Foom: Onomatopoetic vernacular for an intelligence explosion.

Fully general counterargument: An argument which can be used to discount any conclusion the arguer does not like.

Hedon: A unit philosophers use to quantify pleasure. (Note: no actual quantifying is done.)

Hollywood rationality: What Spock does, not what actual rationalists do.

IA: Intelligence augmentation

IAWYC: I Agree With Your Conclusion. Generally used when nitpicking, to make it clear that the nitpicks are not meant to represent actual disagreement. Discussed in Support That Sounds Like Dissent.

IMO/IMHO: In my (humble) opinion

ISTM: It Seems To Me

Kolmogorov complexity: Given a string, the length of the shortest possible program that prints it. See also Solomonoff induction.
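Kolmogorov complexity itself is uncomputable, but compressed length gives a rough, computable upper bound. The sketch below (an illustration added here, with zlib standing in as the "shortest program" proxy) shows the intuition: patterned strings compress far below their raw length, while random bytes do not:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length after zlib compression: a crude, computable upper bound
    on Kolmogorov complexity (the true quantity is uncomputable)."""
    return len(zlib.compress(data, 9))

regular = b"ab" * 500    # highly patterned: a short program prints it
noisy = os.urandom(1000) # random bytes: essentially incompressible

print(compressed_size(regular))  # far below 1000
print(compressed_size(noisy))    # around 1000, plus a little overhead
```

The gap between the two compressed sizes is the compressibility analogue of the gap in Kolmogorov complexity between a structured string and a random one of the same length.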

LCPW: Least convenient possible world. A technique used to prevent oneself from evading the point of a question by nitpicking details.

Logical rudeness: A response to criticism which insulates the responder from having to address the criticism directly, without appearing to be conventional rudeness.

LW: Less Wrong

Meetup: Groups of Less Wrong members sometimes arrange to meet each other in meatspace (in person). Some geographic areas have groups that do this regularly.

Mind-killer: A topic that reliably produces biased discussions, e.g. politics or Pick-Up Artists.

MoR: Also HPMoR; Harry Potter and the Methods of Rationality

Motivated cognition: Reasoning used to reach desired conclusions rather than true conclusions.

NPC: Non-Player Character. In an MMORPG such as World of Warcraft, the characters controlled by humans are PCs (player characters) and the characters controlled by the computer are NPCs (non-player characters).

OB: Overcoming Bias

Ontology/ontological: The philosophical study of the nature of being, existence, or reality. It deals with questions concerning what entities exist or can be said to exist, and how such entities can be grouped, related within a hierarchy, and subdivided according to similarities and differences. See also the ontological argument at Wikipedia for an example of (ab)using ontology to try to prove the existence of God.

Paperclip maximizer: An AI that has been created to maximize the number of paperclips in the universe; a hypothetical unfriendly artificial intelligence.

Paranoid debating: A group estimation game in which one player, unknown to the others, tries to subvert the group estimate.

PC: Player Character; see NPC.

PCT: Perceptual control theory

Philosophical zombie or P-zombie: A creature which looks and behaves indistinguishably from a human down to the atomic level, but is not conscious. See Zombies (sequence).

Prior: What you update from in Bayesian calculations. In practical terms, everything you think you know now.

QALY: Quality-adjusted life year; a concept from the economics of health care.
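The arithmetic behind the concept is simple: life-years are weighted by a health-related quality factor between 0 and 1. A sketch with invented numbers:

```python
def qalys(years: float, quality: float) -> float:
    """Quality-adjusted life years: life-years weighted by health
    quality, where 1.0 is perfect health and 0.0 is death."""
    return years * quality

# A hypothetical intervention granting 10 extra years at quality 0.7
# yields the same QALYs as 7 years in perfect health.
gained = qalys(10, 0.7)
baseline = qalys(7, 1.0)
```

Health economists use this equivalence to compare interventions that extend life against interventions that improve its quality on a single scale.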

Rationality: The mode of thinking within the rationalist movement.

Rationalist taboo: A technique for unpacking words into concepts: taboo the use of a given word or its synonyms. Particularly useful in arguments over definitions.

    ": "The world'world's greatest fool may say the Sun is shining, but that doesn'doesn't make it dark out.""

Shut up and multiply: How to do a utility calculation without scope insensitivity.

Signaling: Conveying information by performing an action which would be costly to perform if the information were not true.

Strong man: Another term for the opposite of a Straw Man: the strongest actual form of an opponent's argument.

Teleology: Discussing an event as though it were caused by its future consequences.

    : When LessWrong was started, Eliezer put a temporary moratorium on discussion of the Singularity or AI. You will see this used in old discussions to allude to these topics.

UFAI: Unfriendly AI

Ugh field: A subject that is thought about less over time due to behavioral conditioning.

Update: See Belief update.

Weak man: The opposite of a Strong Man, and relative to a Straw Man: the weakest version of your opponent's actual arguments.

WBE: Whole Brain Emulation

YMMV: Your Mileage May Vary. See Other-optimizing.
