Open question on a certain 'hot' global issue of importance to FAI

by Dmytry · 1 min read · 23rd Mar 2012 · 36 comments

-4

Personal Blog

A question: why does anything about global warming get downvoted, even a popularly readable explanation of the fairly mainstream scientific consensus? edit: Okay, this is loaded. I should put it more carefully: why is the warming discussion generally considered inappropriate here? That seems to be the case, and there are pretty good reasons for it. But why can't the AGW debate be invoked as an example controversy? The disagreement on AGW is pretty damn unproductive, and so it is a good example of an argument whose productivity could be improved.

Global warming is a pretty damn good reason to build FAI. It's quite seriously possible that we won't be able to do anything else about it. Even a mildly superhuman intelligence, though, should be able to eat the problem for breakfast. Even practical sub-human AIs could massively help with space-based efforts to limit the issue (e.g. friendly space-worthy von Neumann machinery would allow the problem to be solved almost immediately). We would probably still have extra CO2 in the atmosphere, but that is overall probably not a bad thing - it is good for plants.

For that to be important, it is sufficient to have a 50/50 risk of global warming. Even probabilities less than 0.5 for the 'strong' warming scenarios are still a big factor - in terms of 'expected deaths' and 'expected suffering', considering how many humans on this planet lack access to air conditioning. I frankly am surprised that a group of people fascinated with AI would have such trouble with the warming controversy as to make it too hot a topic even for an example of highly unproductive arguments.
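The expected-value point above can be made concrete with a toy calculation. All the probabilities and death counts below are made-up placeholders, chosen only to show the shape of the argument: a low-probability severe scenario can still dominate the expected total.

```python
# Toy expected-deaths calculation with hypothetical numbers.
# The point: a 'strong' scenario with probability well under 0.5
# can still contribute most of the expected harm.
scenarios = [
    # (probability, additional deaths if this scenario occurs)
    (0.50, 0),            # no significant warming (placeholder)
    (0.40, 10_000_000),   # moderate warming (placeholder)
    (0.10, 100_000_000),  # 'strong' warming (placeholder)
]

expected_deaths = sum(p * deaths for p, deaths in scenarios)
print(f"{expected_deaths:,.0f}")  # → 14,000,000
```

Here the 10%-probability scenario accounts for 10 of the 14 million expected deaths, which is why scenarios with probability well below 0.5 still matter in the expectation.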

I do understand that LW does not want political controversies. Politics is the mind-killer. But this stuff matters. And I trust it has been explained here that non-scientists are best off not trying to second-guess the science, but relying on expert opinion. Global warming is our first example of the man-made problems which are going to kill us if there is no AI. Engineered diseases, gray goo, and that sort of thing come later, and will likely be equally controversial. For now we have coal.

The uFAI risk is also going to be extremely controversial as soon as those with commercial interests in AI development take notice - way more controversial than AGW, for which we at least have fairly solid science. If we cannot discuss AGW now, we won't be able to discuss AI risks once Google - or any other player - deems those discussions a PR problem. The discussion at any given time will be restricted to issues about which no one really has to do anything at the time.


36 comments

A question: why does anything about global warming get downvoted, even a popularly readable explanation of the fairly mainstream scientific consensus?

I question the generalisation "Of the comments by User:Dmytry that have been downvoted recently, some of them have been about global warming" --> "All discussion of global warming gets downvoted". In fact, the claim can be trivially refuted by finding discussion relating to global warming that is not downvoted. As of the time of this comment, one can find examples of upvoted or neutral comments discussing global warming by following the link to Dmytry's comment page, finding the heavily downvoted comments about global warming, and then following links to the (currently upvoted) parent and child comments.

Once again, not all instances of people downvoting are part of a conspiracy. Sometimes it just means people disagree with you or object to your style.

As for whether the downvoted comments in question are, in fact, "popularly readable explanation of the fairly mainstream scientific consensus" - I have no idea. I have very little interest in the subject and have not followed the conversation closely. Someone else would have to give their evaluation.

As far as I'm concerned, global warming is a known problem with known insurmountable political (cooperation) problems preventing us from taking the drastic measures needed to solve it.

Global warming is not an existential risk regardless of its truth value.

[anonymous] · 9y · -1

Yeah, in the Bostrom sense of something irreversibly destroying all Earth-originating intelligent life or irreversibly preventing it from colonizing other planets, it isn't; but then neither is thermonuclear war -- several hundred million, possibly billions of people would likely survive one.

but then neither is thermonuclear war

We don't often discuss thermonuclear war on LessWrong either.

Yes, it's probably not going to destroy all ape-originated intelligent life, provided it doesn't set off an anoxic event:

http://en.wikipedia.org/wiki/Anoxic_event

It is still going to kill a fairly significant number of us, and it still requires giving up something for prevention; if you can't discuss it reasonably, then note that you are even less capable of discussing any issue that triggers more fear or requires a larger change.

There's a three-pronged answer to this, as I see it.

First: there's a tacit moratorium on partisan-coded issues around here which do not directly concern the science of rationality or (to a lesser extent) AI - even those on which a broad consensus exists. The reasoning most often given is that a vocally partisan position on such topics would position LW to attract like-minded partisans and thus dilute its rationality focus. The politics of religion is something of an exception; it's essentially treated as a uniquely valuable example of certain biases, though I suspect that status in practice has more to do with the grandfather clause. Anthropogenic global warming is not a uniquely valuable example of any bias I can think of; it's a salient one, but salience often comes with drawbacks.

Second: LW is not a debunking blog, nor a forum dedicated to cheering scientific consensus over folk wisdom, and it should not be except insofar as doing so serves the art and science of rational thinking. There's considerable overlap between LW's natural audience and that of sites which are devoted to those topics, which has on occasion misled (often ideologically opposed) newcomers into thinking it's such a site, but even if a general consensus exists that LW's theory and practice tends to lead to certain positions, it behooves us to guard against adopting those positions as markers of group identity. The easiest way to do that is not to talk about them.

Third, and probably most embarrassingly from the standpoint of healthy group epistemology: by the last census/survey LW is disproportionately politically libertarian, though adherents of that ideology are an absolute minority ([left-]liberalism is slightly more popular, socialism slightly less, other political theories much less). The severity of, proper response to, and to a lesser extent existence of anthropogenic global warming remains an active topic of debate in libertarian circles, though less so in recent years. Higher sensitivity to AGW than to other conservative-coded positions may in part be a response to these demographics.

Global warming is our first example of the man-made problems which are going to kill us if there is no AI.

Er, what? What about diseases promoted by high-population densities in cities and spread via modern international travel? Surely those will kill many more - and are more important in practically every way.

Yes. But if you can't talk reasonably about AGW without going into some form of denial, you won't be able to talk about the diseases rationally either. The ability to talk rationally falls off rapidly with the scariness of the scenario; fear and the importance of an issue do not make you think more clearly - only less so. AGW is a stepping-stone example of what such an issue looks like when there is high-quality science on it, telling us rather uncomfortable things, and when we have to give up something for prevention.

Global warming is our first example of the man-made problems which are going to kill us if there is no AI.

Road traffic accidents kill over a million people per year. I doubt global warming will ever kill people at anywhere near that rate. But for all that, we talk about global warming a lot more than road accidents.

Pneumonia kills over 100 million people each year (!?! actually more like 4 million per year).

I doubt global warming will ever kill people at anywhere near that rate.

In fact, since - as Freeman Dyson says - more people die from cold than from heat, global warming will probably reduce the overall death rate:

More people die from cold in winter than die from heat in summer.

According to this report about the study "Causes for the recent changes in cold- and heat-related mortality in England and Wales":

Warming is highly beneficial to human health, even without any overt adaptation to it. And when adaptations are made, warming is incredibly beneficial in terms of lengthening human life span.

The study says that in the UK over the past four decades, the mortality changes attributable to warming were two orders of magnitude smaller than those attributable to cooling.

However, one study produced different findings in the USA - here.

Pneumonia kills over 100 million people each year.

Where did you get that number? Wikipedia puts it at 4 million per year.

Oops - over 100 million people each year contract pneumonia.

[anonymous] · 9y · 2

For that to be important, it is sufficient to have a 50/50 risk of global warming. Even probabilities less than 0.5 for the 'strong' warming scenarios are still a big factor - in terms of 'expected deaths' and 'expected suffering', considering how many humans on this planet lack access to air conditioning.

The main problem with global warming is not that people who can't afford air conditioning will be less comfortable -- the warming is a few degrees, so people wouldn't even notice it without measurements. The problem is that the warming might lead to rising sea levels (and there are a helluva lot of people living near the sea), and maybe (I'm not sure these things are well-understood) more and stronger hurricanes, and stuff like that.

Nah, they simply won't notice the death rate increase without statistics; that doesn't mean it won't be increasing. When you have a population of camels under loads that already break a significant percentage of camel backs, adding extra weight has a roughly linear effect for small extra weights.

The reason it is a fairly productive topic with regard to charity (in general) is that it is easy to rationalize lack of action, but harder to rationalize positive action that kills. Yes, theoretically, biases are bad for giving; practically, eliminating biases in giving decreases the giving (I think a link to a study about that was even posted here). People are biased and imperfect, and are more likely to donate to better causes when aware that they are committing an action that kills, rather than mere inaction.

There's an extent to which I am totally willing to have a moratorium on global warming. After all, you don't try to convince a religious fundamentalist that they're wrong by going "you're totally wrong, you have no sound basis for thinking the bible is inerrant" - anything that even assumes that is just going to raise their cognitive walls. You change their mind by helping them develop their critical thinking skills. Similarly, I wouldn't say to a fundamentalist-equivalent skeptic "here, have a link to some IPCC report that shows that you're wrong, I'm sure you can check it out yourself." Counterproductive again. So even if theism or global warming are useful examples, I'm willing to avoid them, or certain statements about them, in many cases.

But at the same time I'd rather not just ignore the issue. Fortunately, I've found a certain forum that lets community members write posts to develop readers' critical thinking skills :D So maybe I'll do that.

Well, it is hard to discuss improving the productivity of arguments without any specific examples of arguing. It is hard to think correctly about abstract topics; people perform the Wason Selection Task better when it is described in terms of something specific rather than letters and numbers.
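For readers unfamiliar with it, the Wason Selection Task in its abstract form asks which of four cards ("A", "K", "4", "7") must be turned over to test the rule "if a card has a vowel on one side, it has an even number on the other." A small sketch of the underlying logic (the card labels are just the standard example set):

```python
# A card must be turned over iff its hidden side could falsify the rule
# "vowel on one side implies even number on the other."

def must_turn(visible: str) -> bool:
    """Return True if the card showing `visible` could falsify the rule."""
    if visible.isalpha():
        # A letter card matters only if it is a vowel:
        # its hidden number might be odd.
        return visible.lower() in "aeiou"
    # A number card matters only if it is odd:
    # its hidden letter might be a vowel.
    return int(visible) % 2 == 1

cards = ["A", "K", "4", "7"]
print([c for c in cards if must_turn(c)])  # → ['A', '7']
```

Most people correctly pick "A" but incorrectly pick "4" instead of "7" (the rule says nothing about what is behind an even number); when the same logic is framed as a concrete social rule, error rates drop sharply.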

And I trust it has been explained here that non-scientists are best off not trying to second-guess the science, but relying on expert opinion.

Without saying anything directly about the immediate topic: argument screens off authority, and we really do have examples of experts getting things terribly wrong. Even if that weren't the case, "trust the experts" would still be a terrible heuristic, as it's likely to be gamed.