dammit, you could have told me that before I spent so much time building this flying machine made of bone and flesh...
It's amazing how many supposedly rationalist movements fall into the trap of crippling "reverse stupidity." Many in the atheist movement would not have you make positive pronouncements, not have you form organizations, not have you advocate, not have you adopt symbols or give the movement a name, not have you educate children on atheism, and so on, all because "religion does it." I think in the case of atheism the source is unique: every (modern) atheist knows his or her atheism is a product of scientific understanding but few atheists are willing to admit it (having taken up also the false belief that some things are "outside science"), so they go looking for other reasons, and "reverse stupidity" offers such reasons in abundance.
"... you have to directly challenge the arguments of Nick Bostrom or Eliezer Yudkowsky post-2003."
Just what the heck happened in 2003? In any experimental field, particularly this one, having new insights and using them to correct old mistakes is just part of the normal flow of events. Was there a super-super-insight which corrected a super-super-old mistake?
He's referring to his coming of age as a rationalist (which he hadn't written yet then); his transhumanist ideas before 2003 were pretty heavily infected with biases (like the Mind Projection Fallacy) that he harps on about now.
If conservatives make up the same majority of smart people as of stupid people, then the statement "Not all conservatives are stupid, but most stupid people are conservatives" is completely irrelevant, but I don't think anyone believes otherwise. If there is a positive correlation between intelligence and the truth of one's beliefs (a claim most people probably assume to hold for any definition of intelligence they care about), then the average intelligence of the people who hold a given belief is entangled with the truth of that belief and can be used as Bayesian evidence. Evidence is not proof, of course, and this heuristic will not be perfectly reliable.
Why would the number of stupid people who believe something anticorrelate with the number of smart people who believe it? Most stupid people and most smart people believe the sky is blue. A shift in the fraction of stupid people who do X can take place without any corresponding shift in the fraction of smart people who do X one way or another. Some smart people actively prefer not to affiliate themselves with stupid people and will try to believe something different, but they are committing the error of the OP and should not be listened to anyway.
I believe it was John Stuart Mill who said that.
Nice move using Stalin instead of Hitler, since I get tired of hearing the latter brought up. I myself have endorsed some of Stalin's ideas like "[ideology] in one country" since even if his policies were bad he was at least fairly successful in getting them implemented and lasting for a good while.
every (modern) atheist knows his or her atheism is a product of scientific understanding
This is wrong.
Even presuming that you're speaking very informally, and your statement shouldn't be interpreted literally, it's STILL wrong.
"The least convenient path is the only valid one."
When arguing honestly against an idea with its strongest advocates, is it always true that what is right is not always what is easy? Does choosing not to argue make someone wrong outright, or does declining to enter the argument in the first place make that point of view nonexistent in some way?
Clarification: Just Yudkowsky after 2003, or Yudkowsky and Bostrom together, perhaps sharing the same mistake? It would be useful to know, so that I and others don't make the same mistake.
Matthew: Just me after 2003, not Bostrom.
I call the experience my "Bayesian enlightenment" but that doesn't really say anything, does it? Guess you'll have to keep reading Overcoming Bias until I get there.
michael vassar: You're right when you say a correlation of intelligence with liberalism is evidence for liberalism, but that's not because the stupid people are conservative, it's because the smart people are liberal. At least I think that's what Eliezer meant.
Though you could see the conservativeness of stupid people as strengthening the evidence provided by smart liberal people because it points at there being more of a conservative human baseline to deviate from.
"""A car with a broken engine cannot drive backward at 200 mph, even if the engine is really really broken."""
"When the player's truck is put into reverse, the truck will accelerate infinitely; however, the truck will halt instantly when the reverse key is released."
"I call the experience my "Bayesian enlightenment" but that doesn't really say anything, does it?"
Note to readers: Eli discovered Bayesian probability theory (in general) much earlier than 2003, see http://www.singinst.org/upload/CFAI//design/clean.html#programmer_bayesbinding.
"You're right when you say a correlation of intelligence with liberalism is evidence for liberalism, but that's not because the stupid people are conservative, it's because the smart people are liberal."
If you assume the population is partitioned into liberals and conservatives, a high percentage of stupid conservatives implies a high percentage of smart liberals, and vice-versa. If smart liberals are Bayesian evidence for B, then smart conservatives must be Bayesian evidence against B (note that 'smart' here is relative to the average, not some absolute level of smartness).
Can we agree on the following: if you pick a random stupid person and ask for an opinion on B, and the stupid person says B is false, this cannot be evidence against B unless you have background knowledge about the fraction of people who believe B? And in that case all the work is really being done by the indirect inference about the opinions of smarter people, so calling the stupid person's opinion negative evidence is misleading even if strictly speaking correct.
Isn't the truth of a thing (such as a sentence or artwork) determined by how closely it matches reality? And the match-level is a function of the identity of reality and of the thing. So there is no mention of smart or dumb people anywhere in that.
Good post, and good job putting this into a common language framework. If you convince only one or two more people to think clearly, it was worth it! B
Steven: Yes we can, with the caveat you mentioned earlier about the human baseline. Of course, that point is plausibly precisely what Mill or whoever was pointing to with his comment.
this cannot be evidence against B unless you have background knowledge on the fraction of people who think B,
No. The "unless" clause is still incorrect. We can know a great deal about the fraction of people who think B, and it still cannot serve even as meta-evidence for or against B.
There is an ongoing confusion here about the difference between evidence and meta-evidence. It is as obvious and important as the difference between experimental analysis and meta-analysis, and it is NOT being acknowledged.
"No. The "unless" clause is still incorrect. We can know a great deal about the fraction of people who think B, and it still cannot serve even as meta-evidence for or against B."
This can't be right. I have a hundred measuring devices. Ninety are broken and give a random answer with an unknown distribution, while ten give an answer that strongly correlates with the truth. Ninety say A and ten say B. If I examine a random meter that says B and find that it is broken, then surely that has to count as strong evidence against B.
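This measuring-device scenario can be checked numerically. The sketch below uses toy numbers of my own choosing (broken meters answer 50/50 at random; working meters read correctly 95% of the time) and asks how likely a meter that reads B is to be broken, under each state of the world:

```python
def p_broken_given_says_b(b_true, p_broken=0.9, p_correct=0.95):
    """P(meter is broken | meter reads 'B'), conditional on whether B
    is actually true. Toy assumptions: broken meters answer at random
    (50/50); working meters read correctly with probability p_correct."""
    p_working = 1 - p_broken
    p_says_b_if_working = p_correct if b_true else 1 - p_correct
    p_says_b = p_broken * 0.5 + p_working * p_says_b_if_working
    return (p_broken * 0.5) / p_says_b

# Finding that a B-reading meter is broken is likelier when B is false,
# so the observation counts as evidence against B.
lr = p_broken_given_says_b(b_true=False) / p_broken_given_says_b(b_true=True)
print(lr > 1)  # True
```

Under these assumptions the likelihood ratio comes out around 1.2 in favor of "B is false," which is weak but genuine evidence, just as the comment claims.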
This is probably an unnecessarily subtle point; the overall thrust of the argument is, of course, correct.
We can know a great deal about the fraction of people who think B, and it still cannot serve even as meta-evidence for or against B. There is an ongoing confusion here about the difference between evidence and meta-evidence.
No. From a Bayesian perspective, there is no difference other than strength. This is, of course, different from saying that the truth is what the authorities say it is, but I think that's what you're hearing it as.
Actually, if I'm not wrong (and it still confuses me), arguments from authority have a different conditional probability structure than "normal" arguments.
"You're right when you say a correlation of intelligence with liberalism is evidence for liberalism, but that's not because the stupid people are conservative, it's because the smart people are liberal."
That seems to me exactly wrong. A proposition's truth or falsity is not entangled with the intelligence of the people who profess it. Alien cultists do not change the probability of poorly hidden aliens. Dumb people who argue for evolution over creationism do not raise the probability that Genesis is natural history, no matter how dumb they are. Conservative Proposition X will be true or not true regardless of whether it is supported by a very intelligent conservative or by a very dumb conservative.
From a Bayesian perspective, there is no difference other than strength.
That's precisely why Bayes' Theorem isn't all you need to know in order to reason. It's an immensely powerful tool, but a grossly inadequate methodology.
Again: there is a great deal of confusion about the difference between evidence and meta-evidence here.
If I do find someone whose statements seem to reliably anticorrelate with reality, am I justified in taking their making a statement as evidence that the statement is false?
Caledonian: please define meta-evidence, then, since I think Eliezer has adequately defined evidence. Clear up our confusion!
Eliezer has NOT adequately defined evidence. There is no data that isn't tied to every event through the operations of causality.
To say it abstractly: For an event to be evidence about a target of inquiry, it has to happen differently in a way that's entangled with the different possible states of the target. (To say it technically: There has to be Shannon mutual information between the evidential event and the target of inquiry, relative to your current state of uncertainty about both of them.) Entanglement can be contagious when processed correctly, which is why you need eyes and a brain. If photons reflect off your shoelaces and hit a rock, the rock won't change much. The r...
Yes, Doug. Furthermore, if you can find a pair of people the difference of whose opinions seems to correlate with reality, you can use that as evidence, which is the pattern pointed to by the original quote.
The definition Eliezer offered, and the way in which he used the term later, are not connected in any meaningful way. His definition is wrong.
And you haven't tried to define meta-evidence at all.
Do you know what a meta-analysis study is?
Beware of feeding trolls. If the one can offer naught but flat assertions, you may be better off saying, "Let the audience decide." If you engage and offer defense to each repeated flat assertion, you encourage them to do even less work in the future, since it offers the same attention-reward.
@yudkowsky I would be happy if I could judge the merits of Bayes versus the frequentist approach for myself. I doubt UTD faculty have seen the light, but who knows, they might. I wonder even more deeply whether a thorough understanding of Bayes gives any insight into epistemology. If you can answer that Bayes does offer insight into epistemology, I know for sure I will be around for many more months. If I remember correctly, we both have the same IQ (140), yet I am much worse at mathematics. Of course, my dad is an a/c technician, not a physicist.
I enjoy your hard work and insights, Eliezer. Also Caledonian's comments, mainly for their mystery.
Likewise, if you attempt to engage people who make foolish proclamations and ambiguous definitions, it can reward them with attention and conversation. The benefits to puncturing shoddy arguments are often greater than the prices that need to be paid to do so.
Eliezer has repeatedly offered a definition for a term, gone on to mention that this definition is incomplete, and then failed to explicitly refine the definition or provide a process for the reader to update it. Despite recognizing the fallacious nature of conclusions or arguments supported with su...
Caledonian,
What do you mean by meta-evidence? How is Mr. Yudkowsky's definition of evidence not adequate for use in this post?
How about this for a precise definition: A is evidence about B if p(A | B) != p(A | ~B).
Of course, by this definition, almost everything is evidence about almost everything else. So we'd like to talk about the strength of evidence. A good candidate is log p(A | B) - log p(A | ~B). This is the number that gets added to your log odds for B when you observe A.
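As a minimal sketch of that quantification (the function names and the numbers are my own, not from the thread), the log-odds update can be computed directly:

```python
import math

def log_odds(p):
    """Convert a probability into log odds."""
    return math.log(p / (1 - p))

def posterior_log_odds(prior_p, p_a_given_b, p_a_given_not_b):
    """Bayesian update in log-odds form: observing A adds the log
    likelihood ratio log P(A|B) - log P(A|~B) to the prior log odds."""
    return log_odds(prior_p) + math.log(p_a_given_b / p_a_given_not_b)

# Toy numbers: A is nine times likelier under B than under ~B.
post = posterior_log_odds(0.5, 0.9, 0.1)
p_b = 1 / (1 + math.exp(-post))   # convert log odds back to a probability
print(round(p_b, 2))  # 0.9
```

Working in log odds makes the "strength of evidence" additive: each independent observation simply contributes its own log likelihood ratio.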
Of course, by this definition, almost everything is evidence about almost everything else.
Ding ding ding!
It may even be the case that, by that definition, everything is evidence about everything else. And clearly that doesn't match our everyday understanding and use of the term - it doesn't even match our formal understanding and use of the term.
What's missing from the definition that we need, in order to make the definition match our understanding?
But everything is evidence about everything else. I don't see the problem at all.
But everything is evidence about everything else. I don't see the problem at all.
Given the circumference of Jupiter around its equator, the height of the Statue of Liberty, and the price of tea in China, can you tell me what's sitting atop my computer monitor right now?
If so, what is it?
If not, why not? I gave you plenty of evidence.
Steven, I reduxified your argument as Argument Screens Off Authority.
If not, why not? I gave you plenty of evidence.
Caledonian, you gave evidence, but you certainly didn't give plenty of it. I see you ignored the part of my post where I talked about how to quantify evidence. The important question isn't whether or not we have evidence; it's how much evidence we have.
Let me make an analogy. I can define sugar as sucrose; a specific carbohydrate whose molecular structure you can view on wikipedia. I might say that a substance is "sugary" if it contains some sugar. But by this definition, almost everything is sugary,...
Evidence is like gravity. Everything is pulling on everything else, but in most cases the pull is weak enough that we can pretty much ignore it. What you have done, Caledonian, is akin to telling me the position of three one-gram weights, and then asking me to calculate the motion of Charon based on that.
If, after I have pointed this out, you offer me some sugar cookies containing 1 molecule of sucrose, and then defend yourself by saying that according to my definition, they are indeed sugary, you are being obnoxious. I already told you how to quantify sugariness, and you ignored it for rhetorical reasons.
No, I'm not being obnoxious. I'm pointing out that your definition is bad by showing that it leads directly to common and absurd conclusions.
By Eliezer's definition, even the thing he offers as an example of a thing that isn't evidence IS STILL EVIDENCE.
Everything is pulling on everything else, but in most cases the pull is weak enough that we can pretty much ignore it. What you have done, Caledonian, is akin to telling me the position of three one-gram weights, and then asking me to calculate the motion of Charon based on that.
So close... and yet, so far.
I agree with you that, even if I gave you absolute, complete, and utterly precise data on the three weights, there is no way you could derive the motion of Charon from that.
So: are the three weights evidence of Charon's movement?
For any that may be genuinely confused: If you read What is Evidence?, An Intuitive Explanation of Bayesian Reasoning, and A Technical Explanation of Technical Explanation, you will understand how to define evidence both qualitatively and quantitatively.
For the rest of you: Stop feeding the troll.
Caledonian is just trying to point out that the keys to rationalism are family values and a literal interpretation of the Bible. I don't know why you all can't see something so obvious.
Observe:
"It may even be the case that, by that definition, everything is evidence about everything else. And clearly that doesn't match our everyday understanding and use of the term - it doesn't even match our formal understanding and use of the term.
What's missing from the definition that we need, in order to make the definition match our understanding?"
Jesus.
If this is the same Caledonian who used to post to the Pharyngula blog, he's barred from there now with good reason.
Is there a cognitive bias at work that makes it hard for people not to feed trolls?
Is there a mathematical expression in probability for the notion that unless someone is making a special effort (concerted or otherwise) they can't be any 'wronger' than 50% accuracy? Subsequently betting the other way would be generating evidence from nothing - creating information. Why no mention of thermodynamics in this post & thread?
Not to feed the troll or anything, but yes, the masses and positions of the three weights are evidence about Charon's movement. Why? Because if you calculated Charon's orbit without knowing their masses, positions etc, you'd be less accurate than if you did.
Calculating Charon's orbit without knowing what direction Charon moves in, or even whether it moves at all, is an impossible task. You are substituting "Charon's orbit" for "Charon's movement" in your argument, then acting as though you have made a statement about Charon's movement.
Ben Jones, I don't see the human existence of religion as having any evidential bearing on the existence of a Super Happy Agent sufficiently like a person and unlike evolution that theists would actually notice its existence. Pretty much the same probability as an object one foot across and composed of chocolate cake existing in the asteroid belt. For interventionist Super Happy Agents, same probability as elves stealing your socks.
Incidentally, with sufficiently precise measurements it's perfectly possible to get a gravitational map of the entire Solar System off a random household object.
Ben Jones, I don't see the human existence of religion as having any evidential bearing on the existence of a Super Happy Agent sufficiently like a person and unlike evolution that theists would actually notice its existence.
Any evidential bearing? Surely P(religion X exists|religion X is true) is higher than P(religion X exists|religion X is false).
Nick, I don't see how that follows for the supermajority of religions that are logically self-contradictory, except in the sense that if 1=2 then the probability of the Sun rising tomorrow is nearly 200%. Furthermore, Ben Jones asked about religion in general rather than any specific religion, and religion in general most certainly cannot be true.
In general, any claim maintained by even a single human being to be true will be more probable, simply based on the authority of that human being, than some random claim such as the chocolate cake claim, which is not believed by anyone.
There are possibly some exceptions to this (and possibly not), but in general there is no particular reason to include religions as exceptions.
Incidentally, with sufficiently precise measurements it's perfectly possible to get a gravitational map of the entire Solar System off a random household object.
Also incorrect. More than one configuration of masses can have exactly the same effect on the object. No matter how precisely you measure the properties of the object, you can never distinguish between those configurations.
I should add that this is true about self-contradictory religions as well. For the probability that I mistakenly interpret the religion to be self-contradictory is greater than the probability that the chocolate cake is out there.
"If God did not exist, it would be necessary to invent him."
Nick: Why should an atheistic rationalist have any more faith in a religion that exists than in a religion that doesn't? I don't believe in God; the testimony of a man who claims he spoke to God in a burning bush doesn't sway me to update my probability. I Defy The Data!
My 'lack of faith' stems from a probability-based judgment that there is no Super Agent. With this as my starting point, I have as much reason to worship Yoda as I do God.
Ben Jones, I don't see the human existence of religion as having any evidential bearing on the existence of a Super Happy Agent sufficiently like a person and unlike evolution that theists would actually notice its existence. Pretty much the same probability as an object one foot across and composed of chocolate cake existing in the asteroid belt. For interventionist Super Happy Agents, same probability as elves stealing your socks.
Eli, you're just saying that you don't believe in the existence of a SHASLAPAUETTWANIE. But since you labeled it with: "...that theists would actually notice its existence," then clearly the existence of religion has some evidential bearing on the existence of a SHASLAPAUETTWANIE.
(Blink.)
Um, I concede to your crushing logic, I guess... what exactly am I conceding again?
Flying saucer cultism was helped along by secret Cold War technological advances that were accidentally witnessed by civilians.
For example, the famous 1947 Roswell incident was the crashing of an American strategic reconnaissance super-balloon that was supposed to float over the Soviet Union and snap pictures, which would then be recovered many thousands of miles away. That's why it was made out of the latest high-tech materials that were unfamiliar to people in small town New Mexico in 1947.
The KGB used to generate flying saucer stories in Latin America...
Steve, maybe this was your point anyway, but the incidents you mention indicate that the existence of flying saucer cults is evidence for the existence of aliens (namely by showing that the cults were based on seeing something in the real world.) No doubt they aren't much evidence, especially given the prior improbability, but they are certainly evidence.
Piper had a point. Pers’nally, I don’t believe there are any poorly hidden aliens infesting these parts. But my disbelief has nothing to do with the awful embarrassing irrationality of flying saucer cults—at least, I hope not.
You and I believe that flying saucer cults arose in the total absence of any flying saucers. Cults can arise around almost any idea, thanks to human silliness. This silliness operates orthogonally to alien intervention: We would expect to see flying saucer cults whether or not there were flying saucers. Even if there were poorly hidden aliens, it would not be any less likely for flying saucer cults to arise. The conditional probability P(cults|aliens) isn’t less than P(cults|¬aliens), unless you suppose that poorly hidden aliens would deliberately suppress flying saucer cults.1 By the Bayesian definition of evidence, the observation “flying saucer cults exist” is not evidence against the existence of flying saucers. It’s not much evidence one way or the other.
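The point that a likelihood ratio of 1 carries zero evidence can be sketched with a toy Bayesian update (the numbers below are illustrative assumptions, not claims about actual cult formation):

```python
def update(prior, p_obs_given_h, p_obs_given_not_h):
    """Posterior P(H | obs) by Bayes' theorem."""
    num = prior * p_obs_given_h
    return num / (num + (1 - prior) * p_obs_given_not_h)

# Illustrative numbers: cults are equally probable with or without aliens,
# so observing cults leaves the prior on aliens unchanged.
prior_aliens = 0.01
posterior = update(prior_aliens, p_obs_given_h=0.9, p_obs_given_not_h=0.9)
print(abs(posterior - prior_aliens) < 1e-12)  # True: no update either way
```

Only when the observation is likelier under one hypothesis than the other does the posterior move at all, and the direction of the move depends solely on that ratio.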
This is an application of the general principle that, as Robert Pirsig puts it, “The world’s greatest fool may say the Sun is shining, but that doesn’t make it dark out.”2
If you knew someone who was wrong 99.99% of the time on yes-or-no questions, you could obtain 99.99% accuracy just by reversing their answers. They would need to do all the work of obtaining good evidence entangled with reality, and processing that evidence coherently, just to anticorrelate that reliably. They would have to be superintelligent to be that stupid.
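A toy simulation of this reversal trick (the oracle and its 99.99% error rate are entirely hypothetical, chosen to match the paragraph):

```python
import random

def reversed_oracle(oracle):
    """Turn a reliably wrong yes/no oracle into a reliably right one
    by negating every answer it gives."""
    return lambda truth: not oracle(truth)

def wrong_oracle(truth):
    """Hypothetical oracle that answers incorrectly 99.99% of the time."""
    return (not truth) if random.random() < 0.9999 else truth

random.seed(0)
fixed = reversed_oracle(wrong_oracle)
trials = 100_000
correct = sum(fixed(t) == t for t in (random.random() < 0.5 for _ in range(trials)))
accuracy = correct / trials
print(accuracy > 0.99)  # True: reversing a reliable anticorrelator is near-perfect
```

The simulation only shows that such an oracle would be exploitable; the chapter's point is that building one is exactly as hard as building a 99.99%-accurate oracle, since both require the same entanglement with reality.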
A car with a broken engine cannot drive backward at 200 mph, even if the engine is really really broken.
If stupidity does not reliably anticorrelate with truth, how much less should human evil anticorrelate with truth? The converse of the halo effect is the horns effect: All perceived negative qualities correlate. If Stalin is evil, then everything he says should be false. You wouldn’t want to agree with Stalin, would you?
Stalin also believed that 2 + 2 = 4. Yet if you defend any statement made by Stalin, even “2 + 2 = 4,” people will see only that you are “agreeing with Stalin”; you must be on his side.
Corollaries of this principle:
1Read “P(cults|aliens)” as “the probability of UFO cults given that aliens have visited Earth,” and read “P(cults|¬aliens)” as “the probability of UFO cults given that aliens have not visited Earth.”
2Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance: An Inquiry Into Values, 1st ed. (New York: Morrow, 1974).
3See Scott Alexander, “The Least Convenient Possible World,” Less Wrong (blog), December 2, 2018, http://lesswrong.com/lw/2k/the_least_convenient_possible_world/.
4See also “Selling Nonapples.” http://lesswrong.com/lw/vs/selling_nonapples.