Theism is often a default test of irrationality on Less Wrong, but I propose that global warming denial would make a much better candidate.

Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems. Indeed, someone with a mild case of theism can have the same anticipations as someone without it, and update on evidence in the same way. If they have moved their belief beyond refutation, then in theory it fails to constrain their anticipations at all; and this is often the case in practice.

Contrast that with someone who denies the existence of anthropogenic global warming (AGW). This has all the signs of hypothesis privileging, but also reeks of fake justification, motivated skepticism, massive overconfidence (if they are truly ignorant of the facts of the debate), and simply the raising of politics above rationality. If I knew someone was a global warming skeptic, then I would expect them to be wrong in their beliefs and their anticipations, and to refuse to update when evidence worked against them. I would expect their judgement to be much more impaired than a theist's.

Of course, reverse stupidity isn't intelligence: simply because one accepts AGW doesn't make one more rational. I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality. But if someone is in a milieu that discourages belief in AGW (one stereotype being heavily Republican areas of the US) and has risen above this, then kudos to them: their acceptance of AGW is indeed a sign of rationality.

Here's the main thing that bothers me about this debate. There's a set of many different questions involved:

  • the degree of past and current warming;
  • the degree to which such warming should be attributed to humans;
  • the degree to which future emissions would cause more warming;
  • the degree to which future emissions will happen, given different assumptions;
  • what good and bad effects future warming can be expected to have, at different times and given what assumptions (specifically, what probability we should assign to catastrophic and even existential-risk damage);
  • what policies will mitigate the problem, how much, and at what cost;
  • how important the problem is relative to other problems;
  • what ethical theory to use when deciding whether a policy is good or bad;
  • and how much trust we should put in different aspects of the process that produced the standard answers to these questions, and in alternatives to the standard answers.

These are questions that empirical evidence, theory, and scientific authority bear on to different degrees, and a LessWronger ought to separate them out as a matter of habit. Yet even here some vague combination of all these questions tends to get mashed together into a vague question of whether to believe "the global warming consensus" or "the pro-global warming side", to the point where when Stuart says some class of people is more irrational than theists, I have no idea if he's talking about me. If the original post had said something like, "everyone whose median estimate of climate sensitivity to doubled CO2 is lower than 2 degrees Celsius is more irrational than theists", I might still complain about it falling afoul of anti-politics norms, but at least it would help create the impression that the debate was about ideas rather than tribes.

I really like this place. What a relief to have a cogent and rational comment about the global warming debate, and how encouraging to see it lavished with a pile of karma.

And if we're going to talk on the level of tribes anyway then at least use reasoning like this.

Very nice reasoning in that post. But this is Less Wrong! We're aiming for the truth, not for some complicated political position.

Wouldn't denial of AGW equate to one of the following beliefs?

  1. Climate sensitivity to doubled CO2 is zero, less than zero, or so poorly defined that it could straddle either side of zero, depending on the precise definition.
  2. Increased levels of CO2 have nothing to do with human activity.

Anyone who believes ~1 and ~2 (i.e. rejects both) must believe in some degree of AGW, even if they further believe it is trivial, or masked by natural climate variations.

Belief that sensitivity is below 2 degrees doesn't seem utterly unreasonable, given that the typical IPCC estimate is 3 degrees +/- 1 degree, that the confidence interval is not more than 2 sigma either way ("likely" rather than "very likely" in IPCC parlance), and that building that confidence interval involves conditioning on lots of different sorts of evidence. Belief that sensitivity is below 1 degree does seem like having an axe to grind.

All this is Charney or "fast feedback" sensitivity. The biggest concern is the growing evidence that ultimate (slow feedback) sensitivity is much bigger than Charney (at least 30% bigger, and plausibly 100% bigger). There are also carbon cycle and other GHG feedbacks (like methane), so the long-run impact of AGW includes much more than our own CO2 emissions. Multiplying all the new factors together turns a central estimate of 3 degrees into a central estimate of more than 6 degrees, and then things really do look very worrying indeed (temperatures during the last ice age were only 5-6 degrees less than today; at temperatures 6 degrees more than today there have been no polar ice caps at all).
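The arithmetic in that last step can be made explicit. Here is a minimal sketch using only the rough figures quoted above (a 3-degree Charney central estimate and a 30% to 100% slow-feedback increase); the carbon-cycle and methane multipliers are deliberately left out, since no figure is given for them:

```python
# Sketch of the multiplication described above; all numbers are the
# comment's own rough figures, not precise scientific values.

charney = 3.0      # central "fast feedback" (Charney) sensitivity, degrees C

slow_low = 1.3     # slow feedbacks "at least 30% bigger"
slow_high = 2.0    # "plausibly 100% bigger"

# Range implied by the slow-feedback multiplier alone:
low_estimate = charney * slow_low    # 3.9 degrees
high_estimate = charney * slow_high  # 6.0 degrees

print(round(low_estimate, 1), round(high_estimate, 1))  # 3.9 6.0
```

Any additional carbon-cycle or methane multiplier stacks on top of this range, which is how a 3-degree central estimate becomes "more than 6 degrees" at the high end.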

how important the problem is relative to other problems, what ethical theory to use when deciding whether a policy is good or bad

Apart from those two issues, the other points you bring up are the domain of experts. Unless we are experts ourselves, or have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs. We can widen the uncertainty somewhat (we can confidently expect overconfidence :-), and maybe add a very small systematic bias in one direction (to reflect possible social or political biases; the correction has to be very small, as our ability to reliably estimate these factors is very poor).

I might still complain about it falling afoul of anti-politics norms, but at least it would help create the impression that the debate was about ideas rather than tribes.

Excessive anti-politics norms are a problem here - because the issue has become tribalised, we're no longer willing to defend the rational position, or we caveat it far too much.

Unless we [...] have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs.

Well, yes, but the very fact that a question has strong ideological implications makes it highly probable that experts are biased about it. (I argued this point at greater length here.)

Presumably most of those whose opinions fall outside of whatever the acceptable range is have those opinions either because they believe they have some relevant piece of expertise, or because they believe they have some relevant information about the biases of specific experts, or because they don't believe that their ability to estimate systematic bias is in fact "very poor", or even because they disagree with you about what the experts think. This seems like the sort of information people might falsely convince themselves that they have. But we're no longer just looking at relatively narrow and technical questions like attribution and sensitivity; we're also looking at broader questions like policy, where expert consensus becomes harder to characterize and many different fields become relevant (including futurism and the rational aggregation of evidence and weighing of considerations, which many LessWrongers are probably better at than most domain experts). At that point, the possibility that they're right is surely not so preposterous that we can hold it up as a stronger rationality test than theism.

You're right of course - having policy niggles or disagreements is not a good sign of irrationality. But the harder the science gets, the more disagreement becomes irrational. And I've seen people cycle through "global warming isn't happening" to "it's happening but it's natural" to "it's man-made but it'll be too expensive to do anything about it" in the course of a single conversation, without seeming to realise the contradictions (I've seen theists do the same, but this was worse).

So yes, mild anti-AGW (or anti-certain AGW policy ideas) is not a strong sign of irrationality, but I'd argue that neither is mild theism.

The irrational thing, and I see it often, is people who believe "nothing should be done about global warming and therefore at least one of the questions above has an answer of 'none'". Obviously, they don't use those words. But when someone switches freely from "the earth isn't getting warmer" to "the fact that the earth is getting warmer is part of a natural climate cycle" there's something wrong with that person's thinking.

Contrast that with someone who denies the existence of anthropogenic global warming (AGW)

I don't have the knowledge of climatology to make a reasoned claim about AGW myself one way or another. Whether I believe or disbelieve in AGW would therefore currently have to be based entirely on trusting the positions of other people. Those positions are indeed Bayesian evidence, but "mistrusting the current climatological elite", even if someone places a wrong prior on how likely said climatological elite is to manufacture or misinterpret data, is not remotely similar to the logical hoops that your average theist has to go through to explain and excuse the presence of evil in the world, the silence of the gods, the lack of material evidence, archaeological and geological discrepancies with their holy texts, etc, etc, etc.

So your test isn't remotely as good. It effectively tests just one thing: one's prior on how likely climatologists are to lie or misinterpret data.

It effectively tests just one thing: one's prior on how likely climatologists are to lie or misinterpret data.

People don't start out with a high/low claimed prior on lying climatologists and then decide to start arguing about global warming on the internet - it's vice versa, in most cases. The end result tells you about this whole causal history, which includes a fair bit of irrationality along the way.

Of course, where the causal chain terminates is often in stuff like "my parents had political view X," which we don't particularly want to learn about, and thus has to be controlled for if we want to learn about the intermediate irrationality.

One might argue that a typical theist's knowledge of the lack of material evidence for his religion is also pure hearsay. Neither most theists nor most atheists personally investigated the relevant archaeological artefacts. Similarly, few western theists directly experienced things commonly believed to be extremely evil (the Holocaust, famines, ...). They are simply "mistrusting the current archaeological/historical elite".

edit:

Yes, yes, of course virtually no real theists (or even agnostics) use the "it's all hearsay, I'm merely sceptical" defence. And indeed, large swaths of theology deal with virtually all problems you could think of. I was merely pointing out that an individual theist could, in principle, use a similar defence to the one ArisKatsaris was using.

Religious people aren't generally skeptical that terrible things happen to a lot of people on a very large scale. A large part of the problem of theodicy is constructing explanations for this. You may have more of a point with regard to archaeology, but by and large most of these issues are pretty accessible (certainly more accessible to lay people than complicated climate models).

Endorsing notions like "global warming is a better test of irrationality than theism" is a better test of irrationality than theism. More generally, engagement in vague tribal politics is a better test of irrationality than any object level belief. Liquor is quicker but meta is betta! Meta meta meta meta... MEH TAH! Sing it with me now!

What about endorsing notions like "Endorsing notions like 'global warming is a better test of irrationality than theism' is a better test of irrationality than theism"?

I think it's a sign of rationality, at least in context. Angels would likely consider it insufficiently meta, but man's intellect is bounded.

Heh, fair enough, I guess I don't actually have that good a model of angels. I'll replace "of course" with "likely".

This may be connected to a more general problem: one is trying to extrapolate onto a continuum of how rational people can be by referencing a single bit. Whether that bit is theism or AGW, it's not going to be that helpful. More bits of data would be better.

Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems.

All of these are small problems when they come up only in a narrow context. How often does someone who privileges the hypothesis only do so in a single context?

I think this is a good argument for collecting more points that Less Wrongers can use in real life to gauge someone's rationality. I like to bring up Newcomb's problem and ask for reasons for their choice, and if they're two-boxers I try to persuade them to one-box. One intelligent friend was quickly persuaded to one-box when I outlined the expected results, whereas another person eventually said "I think you should just go with your instincts". I felt that gave me a lot of information about their thinking, but more points to bring up would be good.

It would be especially good to find contrarian beliefs to ask about for different groups so as to more easily spot people who can think outside their group norm.

I think this is a good argument for collecting more points that Less Wrongers can use in real life to gauge someone's rationality.

Mmmm. Trying to pick out rationality litmus tests seems like the kind of project EY was talking about in "The Correct Contrarian Cluster" and "Undiscriminating Skepticism".

I don't know how feasible this is, ultimately. The closest test I can think of is probably the Cognitive Reflection Test. (Which has the advantage of being a trio of little arithmetic brainteasers rather than anything that'll trigger people's politics detectors.)

As long as we're mindkilling let's use whether someone's a republican or a democrat to gauge their rationality!

Good idea. Let's ask for their political party so we can control for it in our prior probabilities of global warming wackery.

Let's ask for their political party so we can control for it in our prior probabilities of global warming wackery.

Well, it's somewhat (weakly) relevant; someone deeply embedded in an anti-AGW tribe who parrots the standard line without thinking too much about it is being less rational than someone in a pro-AGW tribe who comes out strongly against their tribal position.

At the risk of starting a mind-killer war :-) has someone done any actual studies on this? I remember a pair of studies that seemed to show that both sides were equally ignorant in economic matters (though ignorant about different things, always supporting their own position), but can't find it right now.

http://lesswrong.com/lw/9n/the_uniquely_awful_example_of_theism/

Tests which were proposed in the comments include whether a person favours legalization of marijuana, and whether they believe in astrology. (Well, the one about marijuana also includes value judgements: two perfectly rational agents with identical priors and access to the same evidence would agree about the possible effects of marijuana legalization but disagree about whether they're good or bad because of different utility functions.)

I'm thinking about this, and right now I think belief in astrology is the best test:

  • Theism correlates with where and when you grew up more than anything else. (ISTM that in Italy, people from the former Kingdom of the Two Sicilies are far more likely to be religious than people from the former Papal States; and I think that in the Republic of Ireland younger people are less likely to be religious than older people. More generally, there are many more atheists in Europe than elsewhere.)
  • As for anthropogenic global warming, I just don't think the typical person has encountered enough evidence (of the kind they can understand) to have strong grounds to decide one way or the other, so different beliefs will mostly be due to different priors and/or motivated cognition, the former telling us nothing and the latter telling us the person's political affiliation more than anything else.
  • In the case of marijuana legalization (apart from the issue of value judgements), what I see confuses me: ISTM that most people above a certain age are against it and most people below a certain age (except politically right-wing ones) are in favour of it, but that would mean that either 1) support for marijuana legalization advances one funeral at a time, and hence ought to be larger now than 15 years ago than 30 years ago than 45 years ago, or 2) most people change their minds at a certain point in their lives, neither of which I've observed. I suspect that there's some kind of selection bias in the young people I know and the old people I know. (Also, signalling probably plays a helluva part in this, so maybe the old people who claim they never supported marijuana legalization are just lying.)
  • I can think of no social, geographical, or political factor which would substantially correlate with bel