The statistical evidence is that liberalism, especially social liberalism, is positively correlated with intelligence. This does not prove that liberalism is correct; but it does provide some mild evidence in that direction.

It would provide useful evidence if we had no other way to determine the truth of the tenets of conservatism. Given that we do, and that the 'evidence' provided by who believes liberalism vs. conservatism is weak, I suggest it is better to ignore it.

Why? Because these sorts of arguments are dangerous: they so readily degenerate into overvaluing social proof.

BlueAjah: Declaration of bias: I am a liberal, I am intelligent, but I'm not a Democrat or Republican.

It's hard to measure liberalism. For example, half of black people say they are conservative and half say they are liberal, but most outsiders would say most black people are liberal (and it's common for 100% of black people in an area to vote for Obama). People judge their liberalism against people like themselves, so it's hard to compare groups. If you count most black people as liberals, then that intelligence difference between liberals and conservatives might disappear (if it exists; I haven't checked). For example, it's a proven fact that Republicans are smarter than Democrats (because of black people with an average IQ of 85 voting Democrat), although among white people alone there is no real difference.

You also need to consider that intelligence comes with biases, even though it also improves your thinking. Intelligent people are biased towards things that benefit intelligent people, e.g. complexity, even if they hurt other people. Intelligent people are biased towards letting people do whatever they want, because intelligent people like themselves will do sensible things when given the choice. They aren't used to stupid people, who do stupid things when allowed to do whatever they want. Intelligent people need freedom, while stupid people need strong inviolable guidelines about acceptable behaviour.
Stephenjk: How are values true or false? You seem to be arguing for objectivist morality. Consider: all the greatest minds in philosophy, specifically ethics, believed in consequentialism. This provides no weight towards or against that particular ethical system. No one has value expertise. People can value one thing (security) or another (liberty); insert whatever values as necessary. The same is true of progressives and conservatives generally. That fact provides no weight towards what we should value.

Reversed Stupidity Is Not Intelligence

by Eliezer Yudkowsky · 3 min read · 12th Dec 2007 · 113 comments


“. . . then our people on that time-line went to work with corrective action. Here.”

He wiped the screen and then began punching combinations. Page after page appeared, bearing accounts of people who had claimed to have seen the mysterious disks, and each report was more fantastic than the last.

“The standard smother-out technique,” Verkan Vall grinned. “I only heard a little talk about the ‘flying saucers,’ and all of that was in joke. In that order of culture, you can always discredit one true story by setting up ten others, palpably false, parallel to it.”

—H. Beam Piper, Police Operation

Piper had a point. Pers’nally, I don’t believe there are any poorly hidden aliens infesting these parts. But my disbelief has nothing to do with the awful embarrassing irrationality of flying saucer cults—at least, I hope not.

You and I believe that flying saucer cults arose in the total absence of any flying saucers. Cults can arise around almost any idea, thanks to human silliness. This silliness operates orthogonally to alien intervention: We would expect to see flying saucer cults whether or not there were flying saucers. Even if there were poorly hidden aliens, it would not be any less likely for flying saucer cults to arise. The conditional probability P(cults|aliens) isn’t less than P(cults|¬aliens), unless you suppose that poorly hidden aliens would deliberately suppress flying saucer cults.1 By the Bayesian definition of evidence, the observation “flying saucer cults exist” is not evidence against the existence of flying saucers. It’s not much evidence one way or the other.
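
The Bayesian point can be made concrete with a toy calculation (all numbers here are illustrative, not from the essay): when an observation is equally likely under a hypothesis and its negation, Bayes' rule leaves the posterior exactly at the prior.

```python
# Toy Bayes update: does observing "flying saucer cults exist" shift P(aliens)?
# All probabilities below are made up for illustration.

def posterior(prior, p_obs_given_h, p_obs_given_not_h):
    """P(H | observation) via Bayes' rule."""
    evidence = p_obs_given_h * prior + p_obs_given_not_h * (1 - prior)
    return p_obs_given_h * prior / evidence

prior_aliens = 0.01          # prior probability of poorly hidden aliens
p_cults_if_aliens = 0.9      # human silliness produces cults either way...
p_cults_if_no_aliens = 0.9   # ...so the two likelihoods are equal

# Equal likelihoods -> posterior equals prior (up to float rounding):
# the observation is not evidence one way or the other.
print(posterior(prior_aliens, p_cults_if_aliens, p_cults_if_no_aliens))
```

Only a likelihood *difference* moves the posterior; if aliens deliberately suppressed cults, P(cults|aliens) would drop below P(cults|¬aliens), and the observation would start to count.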

This is an application of the general principle that, as Robert Pirsig puts it, “The world’s greatest fool may say the Sun is shining, but that doesn’t make it dark out.”2

If you knew someone who was wrong 99.99% of the time on yes-or-no questions, you could obtain 99.99% accuracy just by reversing their answers. They would need to do all the work of obtaining good evidence entangled with reality, and processing that evidence coherently, just to anticorrelate that reliably. They would have to be superintelligent to be that stupid.
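
The arithmetic can be checked with a quick simulation (a sketch with made-up data, not from the essay): an oracle wrong 99.99% of the time is as informative as one right 99.99% of the time, once you flip its answers, while an ordinary fool, wrong about half the time, carries no information either way.

```python
import random

random.seed(0)
truths = [random.choice([True, False]) for _ in range(100_000)]

def answer(truth, error_rate):
    """Answer a yes-or-no question, incorrectly with probability error_rate."""
    return (not truth) if random.random() < error_rate else truth

# Reliably anticorrelated oracle (wrong 99.99% of the time): reverse it.
acc_reversed = sum(
    (not answer(t, 0.9999)) == t for t in truths
) / len(truths)

# Ordinary fool (wrong ~50% of the time): reversing gains nothing.
acc_fool = sum(
    (not answer(t, 0.5)) == t for t in truths
) / len(truths)

print(acc_reversed)  # close to 0.9999
print(acc_fool)      # close to 0.5
```

Being wrong 99.99% of the time requires just as much entanglement with reality as being right 99.99% of the time; mere stupidity only gets you to the uninformative 50% line.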

A car with a broken engine cannot drive backward at 200 mph, even if the engine is really really broken.

If stupidity does not reliably anticorrelate with truth, how much less should human evil anticorrelate with truth? The converse of the halo effect is the horns effect: All perceived negative qualities correlate. If Stalin is evil, then everything he says should be false. You wouldn’t want to agree with Stalin, would you?

Stalin also believed that 2 + 2 = 4. Yet if you defend any statement made by Stalin, even “2 + 2 = 4,” people will see only that you are “agreeing with Stalin”; you must be on his side.

Corollaries of this principle:

  • To argue against an idea honestly, you should argue against the best arguments of the strongest advocates. Arguing against weaker advocates proves nothing, because even the strongest idea will attract weak advocates. If you want to argue against transhumanism or the intelligence explosion, you have to directly challenge the arguments of Nick Bostrom or Eliezer Yudkowsky post-2003. The least convenient path is the only valid one.3
  • Exhibiting sad, pathetic lunatics, driven to madness by their apprehension of an Idea, is no evidence against that Idea. Many New Agers have been made crazier by their personal apprehension of quantum mechanics.
  • Someone once said, “Not all conservatives are stupid, but most stupid people are conservatives.” If you cannot place yourself in a state of mind where this statement, true or false, seems completely irrelevant as a critique of conservatism, you are not ready to think rationally about politics.
  • Ad hominem argument is not valid.
  • You need to be able to argue against genocide without saying “Hitler wanted to exterminate the Jews.” If Hitler hadn’t advocated genocide, would it thereby become okay?
  • In Hansonian terms: Your instinctive willingness to believe something will change along with your willingness to affiliate with people who are known for believing it—quite apart from whether the belief is actually true. Some people may be reluctant to believe that God does not exist, not because there is evidence that God does exist, but rather because they are reluctant to affiliate with Richard Dawkins or those darned “strident” atheists who go around publicly saying “God does not exist.”
  • If your current computer stops working, you can’t conclude that everything about the current system is wrong and that you need a new system without an AMD processor, an ATI video card, a Maxtor hard drive, or case fans—even though your current system has all these things and it doesn’t work. Maybe you just need a new power cord.
  • If a hundred inventors fail to build flying machines using metal and wood and canvas, it doesn’t imply that what you really need is a flying machine of bone and flesh. If a thousand projects fail to build Artificial Intelligence using electricity-based computing, this doesn’t mean that electricity is the source of the problem. Until you understand the problem, hopeful reversals are exceedingly unlikely to hit the solution.4

1Read “P(cults|aliens)” as “the probability of UFO cults given that aliens have visited Earth,” and read “P(cults|¬aliens)” as “the probability of UFO cults given that aliens have not visited Earth.”

2Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance: An Inquiry Into Values, 1st ed. (New York: Morrow, 1974).

3See Scott Alexander, “The Least Convenient Possible World,” Less Wrong (blog), December 2, 2018, http://lesswrong.com/lw/2k/the_least_convenient_possible_world/.

4See also “Selling Nonapples.” http://lesswrong.com/lw/vs/selling_nonapples.
