This is part of a semi-monthly reading group on Eliezer Yudkowsky's ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.
Welcome to the Rationality reading group. This fortnight we discuss Part J: Death Spirals (pp. 409-494). This post summarizes each article of the sequence, linking to the original LessWrong post where available.
J. Death Spirals
100. The Affect Heuristic - Positive and negative emotional impressions exert a greater effect on many decisions than does rational analysis.
101. Evaluability (and Cheap Holiday Shopping) - It's difficult for humans to evaluate an option except in comparison to other options. Poor decisions result when a poor category for comparison is used. Includes an application for cheap gift-shopping.
102. Unbounded Scales, Huge Jury Awards, and Futurism - Without a metric for comparison, estimates of, e.g., what sorts of punitive damages should be awarded, or when some future advance will happen, vary widely simply due to the lack of a scale.
103. The Halo Effect - Positive qualities seem to correlate with each other, whether or not they actually do.
104. Superhero Bias - It is better to risk your life to save 200 people than to save 3. But someone who risks their life to save 3 people reveals a more altruistic nature than someone who risks their life to save 200. And yet comic books are written about heroes who save 200 innocent schoolchildren, not about police officers who save three prostitutes.
105. Mere Messiahs - John Perry, an extropian and a transhumanist, died when the north tower of the World Trade Center fell. He knew he was risking his existence to save other people, and he had hope that he might be able to avoid death, but he still helped them. This takes far more courage than dying while expecting to be rewarded in an afterlife for one's virtue.
106. Affective Death Spirals - Human beings can fall into a feedback loop around something that they hold dear. They use their great idea to explain every situation they consider; because the great idea explained this situation, it gains weight, so they use it to explain still more situations. This loop can continue until they believe Belgium controls the US banking system, or that they can use an invisible blue spirit force to locate parking spots.
107. Resist the Happy Death Spiral - You can avoid a Happy Death Spiral by (1) splitting the Great Idea into parts; (2) treating every additional detail as burdensome; (3) thinking about the specifics of the causal chain instead of the good or bad feelings; (4) not rehearsing evidence; and (5) not adding happiness from claims that "you can't prove are wrong"; but not by (6) refusing to admire anything too much; (7) conducting a biased search for negative points until you feel unhappy again; or (8) forcibly shoving an idea into a safe box.
108. Uncritical Supercriticality - One of the most dangerous mistakes that a human being with human psychology can make, is to begin thinking that any argument against their favorite idea must be wrong, because it is against their favorite idea. Alternatively, they could think that any argument that supports their favorite idea must be right. This failure of reasoning has led to massive amounts of suffering and death in world history.
109. Evaporative Cooling of Group Beliefs - When a cult encounters a blow to its beliefs (a prediction fails to come true, the leader is caught in a scandal, etc.), the cult will often become more fanatical. In the immediate aftermath, the members who leave will be the ones who were previously the voice of opposition, skepticism, and moderation. Without those members, the cult slides further toward fanaticism.
110. When None Dare Urge Restraint - The dark mirror to the happy death spiral is the spiral of hate. When everyone looks good for attacking someone, and anyone who disagrees with any attack must be a sympathizer to the enemy, the results are usually awful. It is too dangerous for there to be anyone in the world that we would prefer to say negative things about, over saying accurate things about.
111. The Robbers Cave Experiment - The Robbers Cave Experiment, by Sherif, Harvey, White, Hood, and Sherif (1954/1961), was designed to investigate the causes and remedies of problems between groups. Twenty-two middle-school-aged boys were divided into two groups and placed in a summer camp. From the first time the groups learned of each other's existence, a brutal rivalry began. The only way the counselors managed to bring the groups together was by giving them a common enemy. Any resemblance to modern politics is just your imagination.
112. Every Cause Wants to Be a Cult - The genetic fallacy seems like a strange kind of fallacy. The problem is that the original justification for a belief does not always equal the sum of all the evidence that we currently have available. But, on the other hand, it is very easy for people to still believe untruths from a source that they have since rejected.
113. Guardians of the Truth - There is an enormous psychological difference between believing that you absolutely, certainly, have the truth, versus trying to discover the truth. If you believe that you have the truth, and that it must be protected from heretics, torture and murder follow. Alternatively, if you believe that you are close to the truth, but perhaps not there yet, someone who disagrees with you is simply wrong, not a mortal enemy.
114. Guardians of the Gene Pool - It is a common misconception that the Nazis wanted their eugenics program to create a new breed of supermen. In fact, they wanted to breed back to the archetypal Nordic man. They located their ideals in the past, which is a counterintuitive idea for many of us.
115. Guardians of Ayn Rand - Ayn Rand, the leader of the Objectivists, praised reason and rationality. The group she created became a cult. Praising rationality does not provide immunity to the human trend towards cultishness.
116. Two Cult Koans - Two Koans about individuals concerned that they may have joined a cult.
117. Asch's Conformity Experiment - The unanimous agreement of surrounding others can make subjects disbelieve (or at least, fail to report) what's right before their eyes. The addition of just one dissenter is enough to dramatically reduce the rates of improper conformity.
118. On Expressing Your Concerns - A way of breaking the conformity effect in some cases.
119. Lonely Dissent - Joining a revolution does take courage, but it is something that humans can reliably do. It is comparatively more difficult to risk death. But it is more difficult than either of these to be the first person in a rebellion. To be the only one who is saying something different. That doesn't feel like going to school in black. It feels like going to school in a clown suit.
120. Cultish Countercultishness - People often nervously ask, "This isn't a cult, is it?" when encountering a group that thinks something weird. There are many reasons why this question doesn't make sense. For one thing, if you really were a member of a cult, you would not say so. Instead, what you should do when considering whether to join a group is examine the details of the group itself. Is their reasoning sound? Do they do awful things to their members?
This has been a collection of notes on the assigned sequence for this fortnight. The most important part of the reading group though is discussion, which is in the comments section. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!
The next reading will cover Part K: Letting Go (pp. 497-532). The discussion will go live on Wednesday, 7 October 2015, right here on the discussion forum of LessWrong.
The first time I read the sequences, this struck me as unobjectionable and yet another reason to look down on the Nazis. But in reading Sapiens, I realized that I didn't actually have any first- or second-hand evidence on which to base that impression, and that bothered me when reading R:AZ.
Does anyone know of any good works on what the Nazis actually believed when it comes to genetics and the modification of populations? I'd prefer second-hand sources, because I'd rather trust a historian of Nazism on what Nazis believed than any individual Nazi, but I'm willing to read primary sources if there are some that are deeply relevant.
I think a good question is "Should we expect the beliefs of all self-identifying Nazis to converge?" Sure, the government was a dictatorship and there was a lot of propaganda produced by a central authority, so we might expect Nazi beliefs to be more homogeneous than those of other cultures. But I remember reading a LW comment once about surveys conducted by the Catholic Church during the Middle Ages/Renaissance on Catholics in semi-remote locations, and they found that each community had idiosyncratic variations of Catholicism, incorporating everything from polytheism to animism.
A good proxy might be the beliefs of Adolf Hitler, which are more well-known, although not entirely known. A quick perusal of the Wikipedia article on Nazi eugenics indicates that he idealized the Greek city-state Sparta and heralded it as a historical example of a state with pro-eugenics policies. I suppose that how good of a proxy Hitler's beliefs are would depend on how much and how well he expressed them in public.
On the other hand, the German term 'Tausendjähriges Reich' ('Thousand-Year Reich') was also popular at the time, which might indicate an idealization of the future rather than the past.
Another thing is that those are the only things I've seen that seem to indicate that Nazi beliefs were even particularly timeful. Discussion of the past often seems limited to relating modern people of various ethnicities to the geographical locations of their ancestors. I see talk about which ethnicities supposedly have the best hereditary characteristics, but nothing about breeding modern people into some approximation of individuals in their genetic past, or harking back to some past Golden Age.
I recall that my father read a book on this issue. I'll ask him about it next time we talk and relay the information to you. Remind me if I do not do so.
My father said he read a review of a book on this subject posted on r/history on reddit. He did not read the book. He said it might have been written by a British author. Not much help, but this is what I learned.