Damon Runyon clearly has not considered point spreads.
"Many beliefs about procedure are exactly the opposite-- take believing that truth can be taken from the Bible. That procedure is self-justifying and there is no way to dispute it from within the assumptions of the procedure."
That's my point about rationality - the way I think about it, it would catch its own contradictions. In essence, a rationalist would recognize that rationalists don't "win." So as a result, committing yourself to rationality doesn't actually commit you to an outcome, as perhaps following a scripture would.
The big... (read more)
I'm inclined to agree on your latter point: looking at the results of the survey, it seems like it would be easy to go from 'rationalist' as a procedural label to 'rationalist' as shorthand for 'atheist male computer programmer using Bayesian rules.' Of course, that's a common bias, and I think this community is as ready as any to fight it.
As for the former, I tried to address that by pointing out that rationalism means that we've already decided that updating priors is more effective than prayer. That said, I have a perhaps idealistic view of rationality... (read more)
I think the danger here is far smaller than people are making it out to be. There is a major difference between the label "rationalist" and most other identities as Paul Graham refers to them. The difference is that "rationalist" is a procedural label; most identities are at least partially substantive, using procedural/substantive in the sense that the legal system does.
"Rationalist," which I agree is an inevitable shorthand that emerges when the topic of overcoming bias is discussed frequently, is exclusively a procedural la... (read more)
Exposing yourself to any judgments, period, is risky. The OB crowd is perhaps the best commenting community I've come across: people read previous comments and engage the arguments made there. How many other bloggers are like Robin Hanson and consistently read and reply to comments?
Anyway, as a result, any comment is bound to be read and often responded to by others. There may not have been a point value attached, but judgments were made.
Agreed. I have trouble accepting this as a true irrationality. It strikes me as merely a preference. You lose time you could be listening to song A because of your desire to have the same play count for song B, but this is because you prefer the world where play counts are equal to the world where they are unequal but you hear a specific song more. Is that really an irrational preference?
I also agree with VN's disclaimer: this time spent [wasted?] on equalizing playcounts could probably be used for something else. But at what point does the preference for a... (read more)
I have to echo orthonormal: information, if processed without bias [availability bias, for example], should improve our decisions, and getting information is not always easy. I don't see how this raises any questions about the rational process, or as you say, principled fashion.
"But by what principled fashion should you choose not to eat the fugu?"
This seems like a situation where the simplest expected value calculation would give you the 'right' answer. In this case, the expected value of eating oysters is 1, the expected value of eating the fug... (read more)
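The expected-value comparison above can be sketched in a few lines. The payoffs and probabilities below are illustrative assumptions (the original comment only gives the oyster value of 1), not numbers from the fugu example itself:

```python
# Toy expected-value comparison for the oysters-vs-fugu choice.
# All payoffs and probabilities are illustrative assumptions.

def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Oysters: safe, modest enjoyment, payoff normalized to 1.
ev_oysters = expected_value([(1.0, 1.0)])

# Fugu: slightly tastier, but with a small chance of a catastrophic outcome.
ev_fugu = expected_value([(0.999, 1.2), (0.001, -1000.0)])

# Even a 0.1% chance of a very large negative payoff drags the fugu's
# expected value below the oysters', giving the 'right' answer directly.
print(ev_oysters, ev_fugu)
```

With these assumed numbers the calculation favors the oysters; the point is only that the principled fashion asked about is the ordinary weighing of probabilities against payoffs.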
Eliezer Yudkowsky does not sleep. He waits.
You're probably right, but this sidesteps the question. In law school, they'd accuse you of fighting the hypothetical. You're in the least convenient possible world here: you're wide awake, 100%, for the entire relevant duration.
"I am trying to politely tell you that you have a lot to learn about signaling."
That's why I'm here :)
I think you bring up an interesting point here. I agreed with pwno that, once everyone is aware of a signal, it's no longer credible, especially if it's cheap. But I think you're right as well that for the signals you mentioned, it doesn't matter who knows that it's a signal or how long it's been around.
The distinction, I think, is what one is trying to signal. Signals of conformity to a group or cooperativeness to an ally might be affected dif... (read more)
Good point. I should have made the distinction between status signals and "conformity" signals clearer. But I do think that there are very distinct mechanisms at work there, even though the ultimate end [higher status] is probably the same. [That is, we signal conformity to an employer to get a job that will give us higher status.]
I think you're hitting a different, though related point. A business suit and a smile are probably not credible signals, though their absence is a credible signal of the opposite. It's easy to wear a business suit and fake a smile: each applicant to a job opening will likely come with both. Those who don't are almost instantaneously downgraded. It seems that the signal becomes a new baseline for behavior, and though it doesn't credibly signal anything, its absence signals something.
I'm not positive on the mechanism here: it's probably related to the fact that the signal is so low-cost, and that anyone failing to display it is either extremely low status, or signals some other defect.
Politics or not, I find this to be a great illustration of the real-world consequences of failure of rationality. The interesting question is at what point the mechanism breaks down.
The logical course of action for rich countries is to study the most effective methods of poverty alleviation and development, and apply those. We can see clearly that this is not happening, but it's unclear as to why:
-Are rich countries wrong about the conditions they're facing, and thus using improper methods? If so, is there a bias that causes them to misperceive conditions?
... (read more)
I actually think it's the marginally different "Have you stopped beating your wife?" which allows for yes/no answers only, except that neither will help you.
part 2: "So what an expert rationalist should do to avoid this overconfidence trap?"
Apologies for flooding the comments, but I wanted to separate the ideas so they can be discussed separately. The question is how to avoid overconfidence, and bias in general. Picking up from last time:
If we can identify a bias, presumably we can also identify the optimal outcome that would happen in the absence of such bias. If we can do that, can't we also constrain ourselves in such a way that we can achieve the optimal outcome despite giving in to the bias? Fo... (read more)
"So what an expert rationalist should do to avoid this overconfidence trap?"
You mean, how should one overcome bias? Be less wrong, if you will? You've come to the right place. David Balan had a post that didn't receive enough attention over at OB: http://www.overcomingbias.com/2008/09/correcting-bias.html
This comment roughly paraphrases the points I made there.
If we can identify a bias, presumably we can also identify the optimal outcome that would happen in the absence of such bias. There are two ways to achieve this, and I will post them in se... (read more)
"So what an expert rationalist should do to avoid this overconfidence trap? The seeming answer is that we should rely less on our own reasoning and more on the “wisdom of the crowds."
As Bryan Caplan's "Myth of the Rational Voter" pretty convincingly shows, the wisdom of crowds is of little use when the costs of irrationality are low. It's true in democracy: voting for an irrational policy like tariffs has almost no cost, because a single vote almost never matters. The benefit of feeling good about voting for what you like to believe in ... (read more)
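The distinction Caplan draws can be illustrated with a toy simulation. This is a minimal sketch under the standard assumptions: independent, unbiased errors cancel when averaged, while a shared systematic bias (the kind cheap irrationality produces) survives any amount of averaging. The specific numbers are arbitrary:

```python
import random

random.seed(0)
truth = 100.0
n = 10_000

# Unbiased crowd: individual errors are large but independent,
# so they largely cancel in the mean.
unbiased = [truth + random.gauss(0, 20) for _ in range(n)]
mean_unbiased = sum(unbiased) / n

# Systematically biased crowd: every estimate is shifted the same way
# (e.g. voters who all enjoy believing in tariffs), so averaging
# cannot remove the error no matter how large the crowd is.
bias = 15.0
biased = [truth + bias + random.gauss(0, 20) for _ in range(n)]
mean_biased = sum(biased) / n

print(abs(mean_unbiased - truth))  # small: noise averages out
print(abs(mean_biased - truth))    # stays near the shared bias
```

The crowd is only "wise" about the component of error that varies across individuals; when the costs of being wrong are low enough that everyone indulges the same pleasant belief, the crowd average inherits the bias intact.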
Neither do I, though I'm often tempted to find a reason for why my iPod's shuffle function "chose" a particular song at a particular time. ["Mad World" right now.]
It seems that our mental 'hardware' is very susceptible to agency and causal misfires, leaving an opening for something like religious belief. Robin explained religious activities and beliefs as important in group bonding [http://www.overcomingbias.com/2009/01/why-fiction-lies.html], but the fact that religion arose may just be a historical accident. It's likely that somethin... (read more)
Excellent description. Reminds me a little of Richard Dawkins in "The God Delusion," explaining how otherwise useful brain hardware 'misfires' and leads to religious belief.
You mention agency detection as one of the potential modules that misfire to bring about religious belief. I think we can generalize that a little more and say fairly conclusively that the ability to discern cause-and-effect was favored by natural selection, and given limited mental resources, it certainly favored errors where cause was perceived even if there was none, rath... (read more)
Agency misfires and causal misfires can help to suggest religion. For that suggestion to get past your filters, the sanity waterline has to be low. I don't invent a new religion every time I see a face in the clouds or three dandelions lined up in a row.
We're forgetting signaling. Robin would never forgive us, because he sees it in a lot of things, and I happen to agree with him that it's far more pervasive than people think.
In fact, the Tversky example gives people two opportunities to signal: not only do they get to demonstrate higher pain tolerance [especially important for men], they also get to "demonstrate" a healthier heart. Both should be boosts in status.
The same goes for Calvinists: though you truly believe in the elect when you think about it, you don't think about it most of your ... (read more)
I disagree... I think "limited analysis resources" accounts for the very difference you speak of. I think the "rituals of cognition" you mention are themselves subject to rationality analysis: if I'm understanding you correctly, you are talking about someone who knows how to be rational in theory but cannot implement such theory in practice. I think you run into three possibilities there.
One, the person has insufficient analytical resources to translate their theory into action, which Robin accounts for. The person is still rational,... (read more)
I think we're missing a fairly basic definition of rationality, one that I think most people would intuitively come to. It involves the question at what stage evidence enters the decision-making calculus.
Rationality is a process: it involves making decisions after weighing all available evidence and calculating the ideal response. Relevant information is processed consciously [though see Clarification below] before a decision is rendered.
This approach is opposed to a different, less conscious process: our instinctive and emotional responses to situ... (read more)