This is part of a semi-monthly reading group on Eliezer Yudkowsky's ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.

Welcome to the Rationality reading group. This fortnight we discuss Part I: Seeing with Fresh Eyes (pp. 365-406). This post summarizes each article of the sequence, linking to the original LessWrong post where available.

I. Seeing with Fresh Eyes

87. Anchoring and Adjustment - Exposure to numbers affects guesses on estimation problems by anchoring your mind to a given estimate, even if it's wildly off base. Be aware of the effect random numbers have on your estimation ability.

88. Priming and Contamination - Priming contaminates thinking by implicitly shaping which facts enter the data set you attend to. When you are primed with a concept, facts related to that concept come to mind more easily, so the data set your mind selects becomes tilted toward that concept, even if it has no relation to the question you are trying to answer. Your thinking becomes contaminated, shifted in a particular direction: the data set in your focus of attention becomes less representative of the phenomenon you are trying to model, and more representative of the concepts you were primed with.

89. Do We Believe Everything We're Told? - Some experiments on priming suggest that mere exposure to a view is enough to get one to passively accept it, at least until it is specifically rejected.

90. Cached Thoughts - Brains are slow. They need to cache as much as they can. They store answers to questions, so that no new thought is required to answer. Answers copied from others can end up in your head without you ever examining them closely. This makes you say things that you'd never believe if you thought them through. So examine your cached thoughts! Are they true?

91. The "Outside the Box" Box - When asked to think creatively, there's always a cached thought that you can fall into. To be truly creative you must avoid the cached thought: think something actually new, not something you heard was the latest innovation. Striving for novelty for novelty's sake is futile; instead, aim to be optimal. People who strive to discover truth or to invent good designs may, in the course of time, attain creativity.

92. Original Seeing - One way to fight cached patterns of thought is to observe things directly and specifically rather than through secondhand descriptions, as in Pirsig's exercise of writing about a single brick of a building instead of the whole town.

93. Stranger Than History - Imagine trying to explain quantum physics, the internet, or any other aspect of modern society to people from 1900. Technology and culture change so quickly that our civilization would be unrecognizable to people 100 years ago; what will the world look like 100 years from now?

94. The Logical Fallacy of Generalization from Fictional Evidence - The Logical Fallacy of Generalization from Fictional Evidence consists in drawing real-world conclusions from statements invented and selected for the purpose of writing fiction. That data set is not at all representative of the real world, and in particular of whatever real-world phenomenon you need to understand to answer your real-world question. Reasoning from it leads to an inadequate model and inadequate answers.

95. The Virtue of Narrowness - One way to fight cached patterns of thought is to focus on precise concepts.

96. How to Seem (and Be) Deep - To seem deep, find coherent but unusual beliefs, and concentrate on explaining them well. To be deep, you actually have to think for yourself.

97. We Change Our Minds Less Often Than We Think - We all change our minds occasionally, but we don't constantly, honestly reevaluate every decision and course of action. Once you think you believe something, the chances are good that you already do, for better or worse.

98. Hold Off On Proposing Solutions - Proposing solutions prematurely is dangerous because it introduces weak conclusions into the pool of facts you are considering. The data set you think about then becomes tilted toward those premature conclusions, which are likely to be wrong and less representative of the phenomenon you are trying to model than the facts you started from.

99. The Genetic Fallacy - The genetic fallacy, judging a belief by its origin, seems like a strange kind of fallacy: the original justification for a belief does not always equal the sum of all the evidence we currently have available. On the other hand, it is very easy to go on believing untruths absorbed from a source that we have since rejected.


This has been a collection of notes on the assigned sequence for this fortnight. The most important part of the reading group, though, is the discussion, which takes place in the comments section. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!

The next reading will cover Part J: Death Spirals (pp. 409-494). The discussion will go live on Wednesday, 23 September 2015, right here on the discussion forum of LessWrong.

