LessWrong Jargon

TAI: Transformative Artificial Intelligence

This is a short list of common terms and phrases, i.e., jargon used on LessWrong.


A

ADBOC: Agree Denotationally, But Object Connotatively. Discussion in When Truth Isn't Enough

B

Black swan: In the usage of Nassim Nicholas Taleb, a black swan is a rare event whose magnitude is so high as to impact the average of a series. These are characteristic of 'fat-tailed' distributions, as opposed to thin-tailed distributions such as the normal distribution, in which rare events are too unlikely to have a large impact.
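The fat-tailed vs. thin-tailed contrast can be sketched with a small simulation (illustrative only, not from Taleb; the `max_share` helper and the chosen distributions are my own): compare how much of a sample total the single largest draw contributes under a normal distribution versus a heavy-tailed Pareto distribution.

```python
import random

random.seed(0)
n = 100_000

# Thin-tailed: normal draws. No single draw moves the sample mean noticeably.
normal = [random.gauss(1.0, 1.0) for _ in range(n)]

# Fat-tailed: Pareto draws with tail index just above 1, so the mean exists
# but rare, huge draws ("black swans") dominate it.
pareto = [random.paretovariate(1.1) for _ in range(n)]

def max_share(xs):
    """Fraction of the sample total contributed by the single largest draw."""
    return max(xs) / sum(xs)

print(f"normal: largest draw is {max_share(normal):.5%} of the total")
print(f"pareto: largest draw is {max_share(pareto):.5%} of the total")
```

Under the normal distribution the largest of 100,000 draws is a negligible slice of the total; under the Pareto distribution a single extreme draw can account for a visible share of it, which is exactly the "impact the average" property described above.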

C

CEV: Coherent Extrapolated Volition, "In poetic terms, our coherent extrapolated volition is our wish if we knew more, thought faster, were more the people we wished we were, had grown up farther together; where the extrapolation converges rather than diverges, where our wishes cohere rather than interfere; extrapolated as we wish that extrapolated, interpreted as we wish that interpreted."

Clever arguer: Someone skilled at writing convincing-sounding arguments for an existing belief. Inventing clever arguments for a belief does not change the truth value of the belief. Discussed in "The Bottom Line"

D

Deontology / deontological ethics: An approach to ethics that judges the morality of an action based on the action's adherence to a rule or rules. See the Wikipedia article on deontological ethics for more. Contrast consequentialism.

E

EEA: Environment of Evolutionary Adaptedness. An evolutionary psychology term synonymous with the more commonly used "ancestral environment". For humans, refers to the state of tribal bands of hunter-gatherers.

Egan's law: "It all adds up to normality." Surprising truths do not make the sky orange and grey; it stays blue.

ETA: Edited To Add (though some would rather you say "Edit:" instead)

F

Fuzzies: The desired but less useful counterpart to utils. They make you feel you're altruistic and socially contributing.

H

I

I don't know: Something that can't be entirely true if you can even formulate a question.

Inferential distance: The number of inferences, or intermediate steps, it takes someone to get from their existing knowledge to an understanding of the point you're making. See also illusion of transparency.

K

L

M

MWI: Many-Worlds Interpretation, an interpretation of quantum mechanics advocated in Eliezer Yudkowsky's quantum mechanics sequence

N

Noncentral fallacy: A rhetorical move often used in political, philosophical, and cultural arguments. "X is in a category whose archetypal member gives us a certain emotional reaction. Therefore, we should apply that emotional reaction to X, even though it is not a central category member."

O

One-box: One of the choices for Newcomb's problem.

Omega: A hypothetical superintelligent being, canonically found in Newcomb's problem.

P

Password: The answer you guess instead of actually understanding the problem. See Guessing the teacher's password

PD: Prisoner's dilemma

Privileging the hypothesis: The fallacy of singling out a specific hypothesis for investigation when there isn't enough evidence at hand to select this hypothesis over others. E.g., "We have no idea who committed the murder, so let's consider the possibility that Mortimer Q. Snodgrass did it, and investigate him."


From the old LessWrong Wiki Discussion Page:

Talk:Jargon

Phyg and Phygish

"Phyg" and "phygish" are used a lot. I'm looking for recommendations on how to define them without putting this page in the wrong Google index. --R claypool 15:03, 10 August 2012 (UTC)

What counts as jargon?

I've recently had an addition or two of mine removed from the jargon file that I disagree with. So let me explain why I've been adding them.

I'm happy to take the definition of jargon to be "the language, especially the vocabulary, peculiar to a particular trade, profession, or group"

Now, I'm a reasonably well-read lay-person, but every so often, when I'm reading a discussion in comments, I'll come upon a word that I have to go look up on Wikipedia before I can figure out what the commenters are talking about.

I consider most examples of this happening to mean that they're using a word that is jargon. In most cases - the words I've not understood were philosophical jargon... i.e., you have to have studied at least a solid base of philosophy to understand what they mean without reaching for the dictionary.

I'd consider words such as utilitarianism, consequentialism and deontology to be good examples of such philosophical jargon. I might guess at what I think they might mean - but to be sure - a definition (and link to a better explanation) is a good idea to have on hand... and therefore I added them to the jargon file.

The reason being: if a complete newbie (such as myself) doesn't understand them, then other newbies won't either - and we are excluded unnecessarily from the conversation.

My argument is in favour of allowing these words in the jargon file for this reason.

Content of this article

Should this article be a list of Jargon with short descriptions or just repeat the contents of Category:Jargon? MrHen 16:42, 23 April 2009 (UTC)

Random idea: Make this article a list of Jargon with short descriptions, and transclude the contents of this article onto the category page. The reason for duplicating the content on the category page is because when browsing through the category trees, users will often end up on the category page, rather than the article page. --PeerInfinity 17:40, 23 April 2009 (UTC)

Unclear and hard to use

Unless there's overwhelming objection, I'm going to merge the acronyms list to this article and reformat it more like a list of short definitions for the newbie, like most jargon lists I've seen. (Certainly it shouldn't be spread over two pages as it is now, with this page not actually providing any explanations at all.) This should be a single-point info list for n00bs - David Gerard 09:38, 17 April 2011 (UTC)

No objections, so I've done this. Do of course feel free to fix any of my quick definitions you don't like - David Gerard 18:49, 21 April 2011 (UTC)


AFAICT

AFAICT: As Far As I Can Tell

Affect

Affect: Mood or emotion as demonstrated in external physical signs.

Affective death spiral

Affective death spiral: When positive attributions combine with the halo effect in a positive feedback loop.

AGI

AGI: Artificial general intelligence

Anti-epistemology

Anti-epistemology: Bad rules for thinking itself, capable of protecting false beliefs.

Bayesian

Bayesian:

1. A theory of probability based on updating subjective estimates in the light of new evidence. Contrasted with the more objective frequentist approach, which views probability as the mean of an infinite series of the same experiment.

2. Probabilistic reasoning in general.

3. Good, well-done reasoning in general.
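A concrete instance of sense 1 is a single application of Bayes' rule. The numbers below (base rate, true-positive rate, false-positive rate) are made up purely for illustration:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
# Hypothetical numbers: a test for a condition with a 1% base rate,
# a 90% true-positive rate, and a 5% false-positive rate.
prior = 0.01            # P(H): subjective estimate before seeing the evidence
p_e_given_h = 0.90      # P(E|H)
p_e_given_not_h = 0.05  # P(E|not H)

# Total probability of observing the evidence
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Updated (posterior) estimate in the light of the new evidence
posterior = p_e_given_h * prior / p_e
print(round(posterior, 3))  # 0.154: even after a positive test, the condition is unlikely
```

The prior of 1% is "updated" to about 15% - the evidence shifts the subjective estimate, but a low base rate keeps the posterior small, which is the core of the Bayesian sense of probability contrasted with the frequentist one above.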

Bayesian Conspiracy

Bayesian Conspiracy: A fictional secret society of Bayesians.

Belief update

Belief update: What you do to your beliefs, opinions and cognitive structure when new evidence comes along.


Blues and Greens

Blues and Greens: Roman Empire chariot-racing teams that became part of politics. Used in place of real party names. See Mind-killer.



Consequentialism

Consequentialism: A moral theory that places value on the consequences of actions. Covered in more depth here.

Crisis of faith

Crisis of faith: What to have when you may have been quite wrong for a long time.

Dark arts

Dark arts: Rhetorical techniques crafted to exploit human cognitive biases. Considered bad behaviour even if the belief you want to communicate is good.




EY

EY: Eliezer Yudkowsky

FAI

FAI: Friendly AI

Foom

Foom: Onomatopoetic vernacular for an intelligence explosion.



Steel Man

A term for the opposite of a Straw Man: the strongest possible form of an opponent's argument, even if they didn't make it themselves.

Strong man

Another term for the opposite of a Straw Man: the strongest actual form of an opponent's argument.

Weak Man

The opposite of a Strong Man, and relative to a Straw Man: the weakest version of your opponent's actual arguments.

Bayesian

The secret technical codeword that cognitive scientists use to mean "rational". Right up there with cognitive bias as an absolutely fundamental concept on Less Wrong.



IMO/IMHO

In my (humble) opinion


NPC

Non-Player Character (think of an MMORPG like World of Warcraft that has characters controlled by humans and characters controlled by a computer; the characters controlled by humans would be PCs (player characters) and the characters controlled by a computer would be NPCs (non-player characters))

PC

Player Character (see NPC)


Signaling

Conveying information by performing an action that would be costly to perform if the information were not true.
