I did once suggest a similar heuristic; but I feel the need to point out that there are many people in this world with track records of achievement, including, like, Mitt Romney or something, and that the heuristic is supposed to be, "Pay attention to rationalists with track records outside rationality", e.g. Dawkins and Feynman.

Mitt Romney strikes me as a fairly poor example, since from my knowledge of his pre-political life, he seems like a strong rationalist. He looks much better on the instrumental rationality side than the epistemic rationality side, but I think I would rather hang out with Mormon management consultants than atheist waiters. (At least, I think I have more to learn from the former than the latter.)

DaFranker (7y): I fail to see how finding more already-rationalists with a track record would benefit LW specifically*, unless those individuals are public figures of some renown who can attract public attention to LW and related organisations, or who can directly contribute content, insight and training methods. Perhaps I'm just missing some evidence here, but my priors place the usefulness of already-rationalists within the same error margin as non-rationalists who are public figures and would bother to read or post on LW.

Paying attention to (rationalists with track records outside rationality)** seems like it would be mostly useful for demonstrating to aware but uninterested or unconvinced people that training rationality and "raising the sanity waterline" are effective strategies with real-world usefulness outside "philosophical"*** word problems.

* Any more than, say, anyone else, or people with any visible track record who are also public figures.

** Perhaps someone could coin a term for this? It seems like a personspace subgroup relevant enough to deserve a less annoying label. Perhaps something playing on Beisutsukai, or a variation on the Masked Hero imagery?

*** Used here in the layman's sense of "philosophical": airy, cloud-headed, idealist, based on pretty assumptions and "clean" models where everything just works the way it's "supposed to" rather than how things are in real life. AKA the "Philosophy is a stupid waste of time" view.
David_Gerard (7y): See, even as no fan of his whatsoever, I suspect Mitt Romney is a very smart fellow whom I would be foolish to ignore in the general case, and who probably has a fair bit of tried and tested knowledge gained in the pursuit of thinking about thinking. I say this even given the qualms I have about the quality of some things he's been quoted as saying of late; but then, presidential campaigns select for bullshit.

Self-skepticism: the first principle of rationality

by aaronsw · 2 min read · 6th Aug 2012 · 106 comments

When Richard Feynman started investigating irrationality in the 1970s, he quickly began to realize that the problem wasn't limited to the obvious irrationalists.

Uri Geller claimed he could bend keys with his mind. But was he really any different from the academics who insisted their special techniques could teach children to read? Both failed the crucial scientific test of skeptical experiment: Geller's keys failed to bend in Feynman's hands; outside tests showed the new techniques only caused reading scores to go down.

What mattered was not how smart the people were, or whether they wore lab coats or used long words, but whether they followed what he concluded was the crucial principle of truly scientific thought: "a kind of utter honesty--a kind of leaning over backwards" to prove yourself wrong. In a word: self-skepticism.

As Feynman wrote, "The first principle is that you must not fool yourself -- and you are the easiest person to fool." Our beliefs always seem correct to us -- after all, that's why they're our beliefs -- so we have to work extra-hard to try to prove them wrong. This means constantly looking for ways to test them against reality and to think of reasons our tests might be insufficient.

When I think of the most rational people I know, it's this quality of theirs that's most pronounced. They are constantly trying to prove themselves wrong -- they attack their beliefs with everything they can find and when they run out of weapons they go out and search for more. The result is that by the time I come around, they not only acknowledge all my criticisms but propose several more I hadn't even thought of.

And when I think of the least rational people I know, what's striking is how they do the exact opposite: instead of viciously attacking their beliefs, they try desperately to defend them. They too have responses to all my critiques, but instead of acknowledging and agreeing, they viciously attack my critique so it never touches their precious belief.

Since these two can be hard to distinguish, it's best to look at some examples. The Cochrane Collaboration argues that support from hospital nurses may be helpful in getting people to quit smoking. How do they know that? you might ask. Well, they found this result by doing a meta-analysis of 31 different studies. But maybe they chose a biased selection of studies? Well, they systematically searched "MEDLINE, EMBASE and PsycINFO [along with] hand searching of specialist journals, conference proceedings, and reference lists of previous trials and overviews." But did the studies they picked suffer from selection bias? Well, they searched for that -- along with three other kinds of systematic bias. And so on. But even after all this careful work, they are still only confident enough to conclude "the results…support a modest but positive effect…with caution … these meta-analysis findings need to be interpreted carefully in light of the methodological limitations".

Compare this to the Heritage Foundation's argument for the bipartisan Wyden–Ryan premium support plan. Their report also discusses lots of objections to the proposal, but confidently knocks down each one: "this analysis relies on two highly implausible assumptions ... All these predictions were dead wrong. ... this perspective completely ignores the history of Medicare" Their conclusion is similarly confident: "The arguments used by opponents of premium support are weak and flawed." Apparently there's just not a single reason to be cautious about their enormous government policy proposal!

Now, of course, the Cochrane authors might be secretly quite confident, and the Heritage Foundation might be wringing their hands with self-skepticism behind the scenes. But let's imagine for a moment that these aren't just reports intended to persuade others of a belief, but accurate portrayals of how these two groups approached the question. Now ask: which style of thinking is more likely to lead the authors to the right answer? Which attitude seems more like Richard Feynman? Which seems more like Uri Geller?