Theism is often a default test of irrationality on Less Wrong, but I propose that global warming denial would make a much better candidate.

Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems. Indeed, someone with a mild case of theism can have the same anticipations as someone without, and update on evidence in the same way. If they have moved their belief beyond refutation, in theory it thus fails to constrain their anticipations at all; and often this is the case in practice.

Contrast that with someone who denies the existence of anthropogenic global warming (AGW). This has all the signs of hypothesis privileging, but also reeks of fake justification, motivated skepticism, massive overconfidence (if they are truly ignorant of the facts of the debate), and simply the raising of politics above rationality. If I knew someone was a global warming skeptic, then I would expect them to be wrong in their beliefs and their anticipations, and to refuse to update when evidence worked against them. I would expect their judgement to be much more impaired than a theist's.

Of course, reverse stupidity isn't intelligence: simply because one accepts AGW, doesn't make one more rational. I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality. But if someone is in a milieu that discouraged belief in AGW (one stereotype being heavily Republican areas of the US) and has risen above this, then kudos to them: their acceptance of AGW is indeed a sign of rationality.

Here's the main thing that bothers me about this debate. There's a set of many different questions involving the degree of past and current warming, the degree to which such warming should be attributed to humans, the degree to which future emissions would cause more warming, the degree to which future emissions will happen given different assumptions, what good and bad effects future warming can be expected to have at different times and given what assumptions (specifically, what probability we should assign to catastrophic and even existential-risk damage), what policies will mitigate the problem how much and at what cost, how important the problem is relative to other problems, what ethical theory to use when deciding whether a policy is good or bad, and how much trust we should put in different aspects of the process that produced the standard answers to these questions and alternatives to the standard answers. These are questions that empirical evidence, theory, and scientific authority bear on to different degrees, and a LessWronger ought to separate them out as a matter of habit, and yet even here some vague combination of all these questions tends to get mashed together int…

I really like this place. What a relief to have a cogent and rational comment about the global warming debate, and how encouraging to see it lavished with a pile of karma.

And if we're going to talk on the level of tribes anyway then at least use reasoning like this [].
Very nice reasoning in that post. But this is Less Wrong! We're aiming for the truth, not for some complicated political position.
Apart from those two issues, the other points you bring up are the domain of experts. Unless we are experts ourselves, or have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs. We can widen the uncertainty somewhat (we can confidently expect overconfidence :-), maybe add a very small systematic bias in one direction (to reflect possible social or political biases - the correction has to be very small as our ability to reliably estimate these factors is very poor). Excessive anti-politics norms are a problem here - because the issue has become tribalised, we're no longer willing to defend the rational position, or we caveat it far too much.

Unless we [...] have strong relevant information about the biases of experts, the rational thing to do is to defer to expert beliefs.

Well, yes, but the very fact that a question has strong ideological implications makes it highly probable that experts are biased about it. (I argued this point at greater length here.)

Presumably most of those whose opinions fall outside of whatever the acceptable range is have those opinions either because they believe they have some relevant piece of expertise, or because they believe they have some relevant information about the biases of specific experts, or because they don't believe that their ability to estimate systematic bias is in fact "very poor", or even because they disagree with you about what the experts think. This seems like the sort of information people might falsely convince themselves that they have, but at least if we're no longer just looking at relatively narrow and technical questions like attribution and sensitivity but also at broader questions like policy, where expert consensus becomes harder to characterize and many different fields become relevant (including futurism and rational aggregation of evidence and weighing of considerations, which many LessWrongers are probably better at than most domain experts) the possibility that they're right surely is not so preposterous that we can hold it up as a stronger rationality test than theism.
You're right of course - having policy niggles or disagreement is not a good sign of irrationality. But the harder the science gets, the more disagreement becomes irrational. And I've seen people cycle through "global warming isn't happening" to "it's happening but it's natural" to "it's man-made but it'll be too expensive to do anything about it" in the course of a single conversation, without seeming to realise the contradictions (I've seen theists do the same, but this was worse). So yes, mild anti-AGW (or anti-certain AGW policy ideas) is not a strong sign of irrationality, but I'd argue that neither is mild theism.
The irrational thing, and I see it often, is people who believe "nothing should be done about global warming and therefore at least one of the questions above has an answer of 'none'". Obviously, they don't use those words. But when someone switches freely from "the earth isn't getting warmer" to "the fact that the earth is getting warmer is part of a natural climate cycle" there's something wrong with that person's thinking.
Wouldn't denial of AGW equate to one of the following beliefs?

  1. Climate sensitivity to doubled CO2 is zero, less than zero, or so poorly defined that it could straddle either side of zero, depending on the precise definition.
  2. Increased levels of CO2 have nothing to do with human activity.

Anyone who believes ~1 and ~2 must believe in some degree of AGW, even if they further believe it is trivial, or masked by natural climate variations. Belief that sensitivity is below 2 degrees doesn't seem utterly unreasonable, given that the typical IPCC estimate is 3 degrees +/- 1 degree, the confidence interval is not more than 2 sigma either way ("likely" rather than "very likely" in IPCC parlance) and building that confidence interval involves conditioning on lots of different sorts of evidence. Belief that sensitivity is below 1 degree does seem like having an axe to grind. All this is Charney or "fast feedback" sensitivity. The biggest concern is the growing evidence that ultimate (slow feedback) sensitivity is much bigger than Charney (at least 30% bigger, and plausibly 100% bigger). Also, there are carbon cycle and other GHG feedbacks (like methane), so the long-run impact of AGW includes much more than our own CO2 emissions. Multiplying all the new factors together turns a central estimate of 3 degrees into a central estimate of more than 6 degrees, and then things really do look very worrying indeed (temperatures during the last ice age were only 5-6 degrees less than today; at temperatures 6 degrees more than today there have been no polar ice caps at all).
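The "multiplying all the new factors together" step can be made explicit. Here is a minimal sketch; the specific multiplier values are assumptions chosen only to match the comment's qualitative ranges (slow feedbacks adding 30-100%, plus carbon-cycle/GHG feedbacks), not measured quantities:

```python
# Illustrative arithmetic for the comment's "multiplying the new factors" step.
# All factor values are assumptions for illustration, not data.
charney = 3.0        # central "fast feedback" sensitivity, deg C per CO2 doubling
slow_feedback = 1.5  # slow feedbacks: at least 1.3x, plausibly 2x per the comment
carbon_cycle = 1.4   # carbon-cycle and other GHG feedbacks (assumed multiplier)

long_run = charney * slow_feedback * carbon_cycle
print(f"long-run central estimate: {long_run:.1f} deg C")  # comes out above 6
```

With these assumed factors the long-run central estimate lands at 6.3 degrees, which is how a 3-degree Charney estimate can turn into "more than 6 degrees".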

Contrast that with someone who denies the existence of anthropogenic global warming (AGW)

I don't have the knowledge of climatology to make a reasoned claim about AGW myself one way or another. Whether I believe or disbelieve in AGW, it would therefore currently have to be based entirely on trusting the positions of other people. Those positions are indeed Bayesian evidence, but "mistrusting the current climatological elite", even if someone places a wrong prior on how likely said climatological elite is to manufacture/misinterpret data, is not remotely similar to the sort of logical hoops that your average theist has to go through to explain and excuse the presence of evil in the world, the silence of the gods, the lack of material evidence, archaeological and geological discrepancies with their holy texts, etc., etc., etc.

So your test isn't remotely as good. It effectively tests just one thing: one's prior on how likely climatologists are to lie or misinterpret data.

People don't start out with a high/low claimed prior on lying climatologists and then decide to start arguing about global warming on the internet - it's vice versa, in most cases. The end result tells you about this whole causal history, which includes a fair bit of irrationality along the way. Of course, where the causal chain terminates is often in stuff like "my parents had political view X," which we don't particularly want to learn about, and thus has to be controlled for if we want to learn about the intermediate irrationality.
One might argue that a typical theist's knowledge of the lack of material evidence for his religion is also pure hearsay. Neither most theists nor most atheists personally investigated the relevant archaeological artefacts. Similarly, few western theists directly experienced things commonly believed to be extremely evil (holocaust, famines, …). They are simply "mistrusting the current archaeological/historical elite". edit: Yes, yes, of course virtually no real theists (or even agnostics) use the "it's all hearsay, I'm merely sceptical" defence. And indeed, large swaths of theology deal with virtually all problems you could think of. I was merely pointing out that an individual theist could, in principle, use a similar defence to the one ArisKatsaris was using.
Religious people aren't generally skeptical that terrible things happen to a lot of people on a very large scale. A large part of the problem of theodicy [] is constructing explanations for this. You may have more of a point in regards to archaeology, but by and large most of these issues are pretty accessible (certainly more accessible to lay people than complicated climate models).

Endorsing notions like "global warming is a better test of irrationality than theism" is a better test of irrationality than theism. More generally, engagement in vague tribal politics is a better test of irrationality than any object level belief. Liquor is quicker but meta is betta! Meta meta meta meta... MEH TAH! Sing it with me now!

What about endorsing notions like "Endorsing notions like 'global warming is a better test of irrationality than theism' is a better test of irrationality than theism"?
I think it's a sign of rationality, at least in context. Angels would likely consider it insufficiently meta, but man's intellect is bounded.

That is a decidedly ambitious use of the phrase "of course."

Heh, fair enough, I guess I don't actually have that good a model of angels. I'll replace "of course" with "likely".

This may be connected to a more general problem: one is trying to place people on a continuum of how rational they can be by referencing a single bit. Whether that bit is theism or AGW, that's still not going to be that helpful. More bits of data are better.

Theism is a symptom of excess compartmentalisation, of not realising that absence of evidence is evidence of absence, of belief in belief, of privileging the hypothesis, and similar failings. But these are not intrinsically huge problems.

All of these are small problems when they come up only in a narrow context. How often does someone who privileges the hypothesis only do so in a single context?

I think this is a good argument for collecting more points that Less Wrongers can use in real life to gauge someone's rationality. I like to bring up Newcomb's problem and ask for reasons for their choice, and if they're two-boxers I try to persuade them to one-box. One intelligent friend was quickly persuaded to one-box when I outlined the expected results, whereas another person eventually said "I think you should just go with your instincts". I felt that gave me a lot of information about their thinking, but more points to bring up would be good. It would be especially good to find contrarian beliefs to ask about for different groups so as to more easily spot people who can think outside their group norm.
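The "expected results" outline mentioned above can be sketched in a few lines. The payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one) are the usual illustrative numbers, and the 99% predictor accuracy is an assumption, not something stated in the comment:

```python
# Expected-value sketch of Newcomb's problem. ACCURACY, BIG and SMALL are
# assumed illustrative numbers; the argument only needs a reliable predictor.
ACCURACY = 0.99            # assumed probability the predictor calls your choice
BIG, SMALL = 1_000_000, 1_000

# One-box: the opaque box is full iff the predictor foresaw one-boxing.
ev_one_box = ACCURACY * BIG
# Two-box: usually the predictor foresaw it and the opaque box is empty;
# rarely the predictor erred and you take both prizes.
ev_two_box = ACCURACY * SMALL + (1 - ACCURACY) * (BIG + SMALL)

print(f"one-box: {ev_one_box:,.0f}  two-box: {ev_two_box:,.0f}")
```

With these payoffs, one-boxing has the higher expected value for any predictor accuracy above roughly 50.05%, which is the point such an outline turns on.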
Mmmm. Trying to pick out rationality litmus tests seems like the kind of project EY was talking about in "The Correct Contrarian Cluster []" and " Undiscriminating Skepticism []". I don't know how feasible this is, ultimately. The closest test I can think of is probably the Cognitive Reflection Test []. (Which has the advantage of being a trio of little arithmetic brainteasers rather than anything that'll trigger people's politics detectors.)

As long as we're mindkilling let's use whether someone's a republican or a democrat to gauge their rationality!

Or whether someone's a neonazi!
Good idea. Let's ask for their political party so we can control for it in our prior probabilities of global warming wackery.
Well, it's somewhat (weakly) relevant; someone deeply embedded in an anti-AGW tribe who parrots the standard line without thinking too much about it is being less rational than someone in a pro-AGW tribe who comes out strongly against their tribal position.
Is an "anti-AGW tribe" one that disbelieves in AGW or one that's opposed to it?
At the risk of starting a mind-killer war :-) has someone done any actual studies on this? I remember a pair of studies that seemed to show that both sides were equally ignorant in economic matters (though ignorant about different things, always supporting their own position), but can't find it right now.
My mad Google-fu turned up [] for you.
Yes, that's what I was thinking of (as well as the follow up, mentioned in [] , where he does the same to the 'other side').
Quite a bit: This article [] discusses some of them, but ignores others. There's some interesting data if one looks at the GSS data which suggests that in the US, self-identified moderates are actually often the least likely to understand basic science [] .
Didn't read the links, but posting to note that I caught myself thinking "that makes sense; they're moderates because they're aware of their ignorance []".
(For posterity: The Two-Party Swindle [] explains one of the key reasons why not.)

Tests which were proposed in the comments include whether a person favours legalization of marijuana, and whether they believe in astrology. (Well, the one about marijuana also includes value judgements: two perfectly rational agents with identical priors and access to the same evidence would agree about the possible effects of marijuana legalization but disagree about whether they're good or bad because of different utility functions.)


I'm thinking about this, and right now I think belief in astrology is the best test:

  • Theism correlates with where and when you grew up more than anything else. (ISTM that in Italy, people from the former Kingdom of the Two Sicilies are far more likely to be religious than people from the former Papal States; and I think that in the Republic of Ireland younger people are less likely to be religious than older people. More generally, there are many more atheists in Europe than elsewhere.)
  • As for anthropogenic global warming, I just don't think the typical person has encountered enough evidence (of the kind they can understand) to have strong grounds to decide one way or the other, so different beliefs will mostly be due to different priors and/or motivated cognition, the former telling us nothing and the latter telling us the person's political affiliation more than anything else.
  • In the case of marijuana legalization (apart from the issue of value judgements), what I see confuses me: ISTM that most people above a certain age are against it and most people below a certain age (except politically right-wing ones) are in favour of it, but that would mean that either 1) support for mar…
You may find [] interesting although unfortunately the survey data does not include questions about marijuana or drugs in general.
(I haven't finished reading it yet.) Yeah, I had forgotten about population aging, though I'm not sure how big an effect it is. I'd guess the median age (in Italy) has increased between 5 and 30 years in the past 45 years. From the abstract: That's what happens in diachronic linguistics too: when adults change the way they speak, that's usually towards the way younger cohorts speak rather than away from it (just google for Queen vowels). In absence of any population aging, that would only accelerate linguistic changes among the population as a whole.
A common error people make when they see an old vs. young split in opinion is assuming that it must be an age effect rather than a cohort effect. Thank you for avoiding that mistake by noticing that it could be either! (Maybe I could use that as a rationality litmus test, ha ha.) You got me curious! I pulled up a chapter [] from the NSF's latest Science & Engineering Indicators report; it links a spreadsheet [] of survey results from 1979 to 2010 on how scientific US adults think astrology is. (Strictly this isn't the same thing as believing in astrology but I'd expect it to be a fair proxy.) In the 2010 sample, people who were young, female, less educated, or knew fewer science facts were more likely to think astrology was scientific. I should say that this doesn't automatically mean astrology is a worse rationality test than atheism. Atheism itself correlates [] with sex, race, age, and education level, at least in the US.

It would amuse me if there was a sizable population that thought astrology was scientific and rejected it on that basis because they don't trust science.

This is actually similar to the medieval Catholic church's position on astrology [], at least if you understand "scientific" to mean "what passed for scientific during the middle ages".
TheOtherDave []: Eugine_Nier []: What evidence are you aware of that the Church condemned those particular propositions for being "science" (natural philosophy), rather than for being "errors" (falsehoods)?
My point was that the church considered the evidence for the propositions suspect since it was merely "science" (natural philosophy).
I'm pretty sure I understood your point. I was asking for some reasons to think your point is true.
This reminds me of an old priest who pointed out that people who don't believe in God tend to believe in astrology and other superstitions, and said that was because “people have to believe in something or another”. However weird that might look now, I still think that among the demographics he was familiar with (people growing up in a smallish town in Italy in the early 20th century) his observation (about the correlation, not about its cause) was likely not wrong.
I wonder if it ever crossed his mind that "What I believe is equivalent to astrology and other superstitions." Did he just think he was lucky to have slotted the truth into his belief-hole?
Actually, as for this particular issue, a cohort effect was what I used to consider obvious, and I didn't hypothesize an age effect until I looked for a long-term trend and failed to see one. (Maybe I haven't looked in the right places, though.)
Assuming by “young” they mean (say) younger than 18 rather than (say) younger than 50 (and that they're talking about age effects rather than cohort effects), and with the possible exception of gender, that does sound like a description of the groups of people who I'd expect to be less rational. Hence, that doesn't sound like as strong a reason to doubt the effectiveness of belief-in-astrology as a test for rationality as finding that people from South Examplistan are more likely to believe in astrology than people from North Examplistan.
"Young" is my own way of summarising the results for the different age subgroups; I saw no correlation with age for ages >34, but people aged 18-24 thought astrology was more scientific than people aged 25-34, who in turn thought it more scientific than people aged 35+. (The sample had no under-18s, unfortunately.) In any case, astrology seems like a good item to add to a potential list of rationality probes.
This is actually a good test, but not in the way it was intended. Drug laws are not about drugs [], and I expect someone who claims a high level of rationality about political issues to understand this point. So when someone discusses the issue of marijuana prohibition by doing some sort of cost/benefit analysis that takes the purported motives of these laws at face value, I find this to be a miserable failure, no matter what his ultimate conclusions.
I'm not sure the case for marijuana legalization is cut and dried even for pot lovers: "Our neighborhoods continue to complain daily about the disruption and public safety issues presented by medical marijuana businesses operating in the city,'' says [a popular Los Angeles City Councilman]. [] Maybe marijuana really does cause less trouble overall if the law forces you to smoke it on the sly.

'disruption and public safety' sounds like it would be a kind of trouble an order of magnitude or two below trouble like 'the destroying of thousands of lives through courts & prisons'.

How about making penalties for marijuana use lighter?
Point of order - marijuana legalization is not synonymous with the particular legal and regulatory regime for medical marijuana adopted in Los Angeles, CA.
All I said was that the case wasn't cut and dried...

Well, I wonder how someone's global warming view correlates with correctly solving problems involving Bayesian reasoning, things like the Monty Hall puzzle, and a bunch of other problems that people get wrong via a fallacy. It may be more correlated than religiosity, in which case it would be a better test. Or it may be less correlated, in which case it would be a worse test. You know, we can test experimentally which is the better test.
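For the Monty Hall puzzle specifically, the "correct solving" part is easy to check empirically. A quick simulation, assuming the standard rules (the host knows where the car is, always opens a goat door you didn't pick, and always offers the switch):

```python
# Monte Carlo check of the Monty Hall puzzle under the standard rules.
import random

def play(switch, rng):
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # Host opens a door that is neither the player's pick nor the car.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(0)  # fixed seed so the run is reproducible
n = 20_000
stay_wins = sum(play(False, rng) for _ in range(n)) / n
switch_wins = sum(play(True, rng) for _ in range(n)) / n
print(f"stay: {stay_wins:.3f}  switch: {switch_wins:.3f}")  # roughly 1/3 vs 2/3
```

Switching wins whenever the initial pick was wrong, which happens 2/3 of the time; the simulation just makes that concrete for anyone who distrusts the verbal argument.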

When I opened this (already heavily downvoted) thread I was actually expecting to read an argument that belief in Global Warming would be the sign of irrationality.

:-) It's quite interesting - it's one of my most heavily downvoted posts ever!
Congratulations? EDIT: If you're actually excited because you are interested in the response you got and what it might mean, genuine congratulations. Way to care more about learning than about signalling.
Well, it all depends on why it was downvoted. If it was a question of style and tone, I'll take that on board and take more care. If they felt the argument was incorrect, that's also relevant. Since I can't estimate how many downvoted for this reason, I'll use the critiques in the comment section as information. But there are a bit too many downvotes for that. Downvoting from -6 to -7 is generally a sign that you really, really disliked the post (and the comment section didn't show such dislike). I can think of two possible reasons that can explain this level of dislike: maybe many see the post as troll bait/tribal, or maybe there are a few on LW who actually disbelieve AGW. In the first case, I apologise, and should have made it more clear that one of my points was that we shouldn't abandon stating the rational AGW position simply because it's tribal and attracts trolls. In the second case, it's worrying for the Less Wrong community.
Or that you're a believer in the "vote without reference to existing score" school of thought on karma, which some LWers are.
Possibly. But I'm not convinced this school has much effect on the downward votes (there seems to be a reluctance to move posts below -2 or -3 unless they're very disliked).
I considered downvoting your post after reading steven0461's comment; his basic point is definitely worth keeping in mind. But I decided against it as I think your basic line of thinking was fair, and in fact I probably would've upvoted your post had you

  • elaborated on specifically which aspects of the AGW hypothesis you'd propose as rationality probes (e.g. whether human activity can raise CO₂ levels in the troposphere, whether tropospheric CO₂ levels have risen over the last x years, etc.)
  • made the weaker (and easier to defend) claim that AGW denial (with "AGW denial" having been fleshed out as suggested in the previous bullet point) was about as good an irrationality test as theism, rather than a better one

I expect the remaining objections to AGW-denial-as-rationality-test would apply just as much to theism-as-rationality-test, in which case it'd still be justified to say the former is as good as the latter. Theism correlates with partisan politics too, and if anything gives fewer bits of information about someone's rationality (being basically a yes-no condition) than AGW denial (which could be a sliding scale). I'm not 100% sure [] that trying to probe rationality in this way is worth the effort, but again, this objection applies as much to theism as AGW denial. [Edit to rephrase "even more binary" in terms of giving less information.]
Fair points.
Downvoting the parent comment to -2 seems pretty churlish to be honest.

IMHO, the best "test of irrationality" would be acceptance of alternative medicine.

It matters little whether you believe in global warming, but belief in homeopathy, faith healing or anything else that makes you delay the official thing will make a difference in your life, and not for the better.

Given the state of the official thing [], I don't think that's obvious.
Isn't that under dispute between "believers" and non-"believers" to begin with?

I looked into this issue and found no conclusive evidence of any global warming, let alone AGW or any catastrophic warming trends. Granted, this was several years ago. So where's the evidence? links?

There's an entire climate blogosphere out there, full of people who know more and care more, and I see no reason for people to rehash the debate here.

A global temperature trend data set based on satellite data that I consider reliable is maintained at: [] The data goes back to 1978. The last 6 months or so look like a particularly high variance, low trend period, which leaves me thinking that when the variance dies down, we may see a significant shift in the 5-10 year trend line. The guy has been a skeptic, but has accepted that his data shows a warming trend, though on the lowest end of the UN commission's estimated ranges, coming in at about 0.13C per decade.
I upvoted you (from -1), not because I agree with your conclusion but because you're asking for additional information to inform your decision, which should be celebrated, not punished, on Less Wrong.
Here's an article [] by William Nordhaus, a climate economist often attacked by people like Joe Romm for arguing for a slower path of carbon emissions reduction than others. Key graph [] here. It's hard to do a thorough search and miss such things.
In this article, Nordhaus says that because there is no outright Soviet-style repression against dissenters in the academia, it's absurd to suppose that dissenters might be afraid to speak their mind. Regardless of whether his overall positions about global warming are correct, Nordhaus is being either naive or disingenuous here. Clearly there are many ways in which expressing contrarian opinions might be deadly for one's academic career, and which don't involve any open persecution (or even any open formal condemnation by the official institutions).
Nordhaus's position to me seems to be stronger than you make it out to be. Here's the thing: even in the Soviet repression some academics risked their lives to speak out. You'd expect at least that much speaking out then among academics in the relevant fields when all they have to risk is their academic careers. Yet, in the relevant disciplines, one doesn't see much of any at all. Similarly, if repression of some form were serious, one would expect that the tenure system would cause more people to be free to speak out and one would expect a lot more vocal expressions of dissent from tenured professors than non-tenured faculty, but there doesn't seem to be such a pattern.
Well, this [] is an example that I linked to elsewhere [] in this thread.
Which is why we don't do science by anecdote, or by citing one example. These attempts at measuring consensus seem relevant: [] [] And, of course, one of the main lessons of rationality is that such reports are much, much stronger evidence than finding examples of dissident views.
Yes, there's no question that there are individuals who are dissenters. And of course, Lindzen is one of both the most vocal and the most qualified. His existence and positions alone should substantially reduce how confident people are in the claim that AGW is correct in a strong sense. However, I'm not asserting that there aren't any dissenters, just that if the rarity of vocal dissent were caused primarily by suppression of dissent, one would expect more dissent than one sees.

Nordhaus's position to me seems to be stronger than you make it out to be. Here's the thing: even in the Soviet repression some academics risked their lives to speak out. You'd expect at least that much speaking out then among academics in the relevant fields when all they have to risk is their academic careers. Yet, in the relevant disciplines, one doesn't see much of any at all.

The trouble is, the situation is fundamentally different here. If there existed some sort of crude open attempt to dictate official dogma, as in the Soviet Union, I have no doubt that a small but still non-zero minority would speak out against it, no matter what the consequences. However, in the modern academic system, there is no such thing -- rather, there is a complex system of subtle but strong perverse incentives that lead to systematic biases and a gradual drift of the academic mainstream away from reality. (Of course, the magnitude of these problems varies greatly across different fields.)

In this situation, a contrarian faces a situation where making fundamental criticism of the state of the field won't invite any open persecution or accusation of heresy, but it will lead to profession…

In medicine, John Ioannidis has basically built his career around exposing unpleasant truths that the perverse incentives have led the field away from. He has gotten several of his papers to various top journals, is currently a Professor of Medicine at Stanford, and been cited over 30,000 times. Isn't that evidence that you can make fundamental criticisms of the state of the field without sacrificing your career?

My intuition, both in the case of Ioannidis and in other somewhat similar cases - such as the WEIRD paper, which seriously questioned the generalizability of pretty much all existing psychological research, and which has been cited almost 300 times since its publication in 2010 - is that when a field is drifting away from reality, most of the people working within the field are quite aware of the fact. When somebody finally makes a clear and persuasive argument about this being the case, everyone will start citing that argument.

I certainly don't deny that the self-correcting mechanism you describe has worked to some extent in some fields in the recent past. However, it also seems evident that in certain other fields nothing like that is happening, even though their mainstream has long been drifting far from reality, and the only people making cogent fundamental criticism are outsiders completely out of grace with the establishment.

I don't have anything like a complete theory that would explain when correct fundamental criticism will be acclaimed as an important contribution, and when it will trigger a negative, career-killing response from the establishment. Now, of course, one possibility is that I have simply acquired crackpot beliefs on several subjects and am completely misdiagnosing the situation. Clearly, I would disagree, but examining the problem further would require getting into a complex discussion of each particular subject in question.

That said, regarding the specific question of fields that have bearing on the global warming controversies, my current positions are (mainly) ones of confusion and indecision. They are not among the examples of clearly pathological fields that I have in mind. In the context of this thread, I merely want to point out that arguments such as the one advanced by Nordhaus aren't enough to give much certainty about the health of these areas.
This paper [] looks to me like it accurately criticizes a basic and important methodological flaw in some of the climate change literature; my impression is that the authors haven't suffered for it, but that they also haven't been listened to all that much. Note that although Annan disagrees with the more extreme predictions, he also explicitly disagrees with climate skepticism, which helps convince me that skepticism is probably wrong (since he seems pretty reasonable), but which also leaves Vladimir free to argue that an actual skeptic would face greater career risks. Your examples look only questionably relevant to me, because those fields aren't politicized in the same way that climate change is.

(Is there a set of conditions that would convince/enable you to write posts explaining to LessWrong how to engage in meta-level Hansonian/Schellingian analyses similar to the one you did in your comment? Alternatively, do you know of any public fora whose level of general intelligence and "rationality" is greater than LessWrong's? I can't immediately think of any better strategies for raising the sanity waterline than you or Steve Rayhawk writing a series of posts about signaling games, focal points, implicit decision policies, social psychology, &c., and how we should use those concepts when interpreting the social world. But of course I have no idea if that would be a good use of your time or if it'd actually have any noticeable impact. Anyway it seems possible there'd be a way to raise funds to pay you to write at least a few posts, Kickstarter style, or I could try to convince Anna and Julia from the new/upcoming Center for Modern Rationality to write up some grants for you.)

Thanks for the kind words, but I wouldn't be able to allocate enough time for such a project at the present moment. In fact, I've had plans to write something along these lines for quite a while, but original articles take much more time than comments. (And I've barely had any time even for comments in recent months.) Also, realistically, I'm not sure how successful the product would be. I don't have much talent for writing in an engaging way, which is further exacerbated by English not being my native language. So I think that even with the best possible outcome, not very many people would end up reading it.
Just noting that I would certainly read such a thing. You seem to have a knack for insight, regardless of your language skills (which are anyway, if only in my opinion, quite good).
I don't find this example concrete. I know very little about economics ideology. Can you give more specific examples?
I appreciate someone at least providing some evidence :P However, this article doesn't address the criticism that the temperature graph is flawed/inaccurate, which I have seen persuasively argued. I don't have any resources on hand, since I looked into this years ago. If you want to make the case that this issue is a rationality "litmus test", then not only should you really be providing some evidence, but you should be showing that the arguments against that evidence are wrong, too. You should be able to make a pretty unequivocal case, right?

I'm going to take Steven's advice below and not recap climate discussion here. However, if you want to do your own research and make a large-stakes bet about persuading some designated neutral judges on the extent of warming in the last 100 years, structured to express the disagreement, I would probably be keen to take it.

There's this [] presentation by Richard Lindzen [] to the British House of Commons explaining why the predictions of catastrophic consequences of global warming are BS.
What did you find when you looked into the issue?

I work in England, in a university environment, so my acceptance of AGW is the default position and not a sign of rationality.

No confirmation bias here, I am sure.

Is there a confirmation bias present? It would be better if you explained it, I guess.
Well, you could be implying that because I've been primed by my environment to believe in AGW, I will therefore believe in AGW? Or is it that because I only see irrational people disbelieving AGW, I think all disbeliefs in AGW are irrational?