Abstract: Test world-models [at least somewhat] scientifically by giving others and yourself the opportunity to generate straightforwardly and immediately testable factual predictions from the world-model. Read up on facts to make sure you are not wrong before posting, not only to persuade.

I have this theory: there are people with a political opinion of some kind who generate their world-beliefs from that opinion. This is a wrong world-model. It doesn't work for fact-finding; it works for tribal affiliation. I think it is fair to say we have all been guilty of this on at least several occasions, and that all of us do it in at least some problem domains. Now, suppose you have some logical argument that contradicts other people's world-model, starting from very basic facts. And you are writing an article.

If you source those basic facts, here's what happens: the facts are read and accepted, the reasoning is read, the conclusion is reached, the contradiction with the political opinion gets noted, the political opinion does NOT get adjusted, the politically motivated world-model generates a fault inside your argument, and you get an entirely counterproductive and extremely irritating debate about semantics or argumentation techniques. In the end, not an iota changes about the world-model of anyone involved in the debate.

If you don't source those basic facts, here's what happens: the facts are read and provisionally accepted, the reasoning is read, the conclusion is reached, the contradiction with the political opinion gets noted, the political opinion does not get adjusted, and the politically motivated world-model generates wrong expectations about basic, easily testable facts. The contradiction eventually gets noted, the wrong world-model gets a minor slap on the nose, and it actually does decrease in its weight ever so slightly for generating wrong expectations. The person is, out of necessity, doing some actual science here - generating testable hypotheses from their theory about facts they don't know, and having them tested (and shown wrong, providing feedback in a somewhat scientific manner).
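To make the talk of "weight" concrete, one way to read it is as Bayesian updating over competing world-models. The following is a minimal sketch of that reading in Python (my illustration, not anything from the post; the two models and all numbers are hypothetical):

    # Toy sketch: two competing world-models assign probabilities to an
    # easily testable fact, and checking the fact shifts weight between
    # them by Bayes' rule. All numbers are made up for illustration.

    def bayes_update(priors, likelihoods):
        # Posterior weights from prior weights and the likelihood each
        # model assigned to the outcome that was actually observed.
        unnormalized = [p * q for p, q in zip(priors, likelihoods)]
        total = sum(unnormalized)
        return [u / total for u in unnormalized]

    priors = [0.5, 0.5]       # equal prior weight on models A and B
    likelihoods = [0.4, 0.6]  # P(fact turns out true | model); it turns out true
    print(bayes_update(priors, likelihoods))  # [0.4, 0.6]: A's weight drops slightly

The "slap on the nose" is that small drop, and an unsourced wrong expectation about an easily checkable fact is exactly the kind of cheap observation that triggers it.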

Unfortunately, any alteration to a world-model is uncomfortable - world-models, as memes, have a form of self-preservation - so nobody likes this, and the faulty world-models produce considerable pressure to demand that you source the basic knowledge upfront, so that the world-model can know where it can safely generate non-testable faults.

Another giant positive effect (for society) happens when you are wrong, and you are the one who has been generating facts from a world-model. Someone looks up the facts, and then, blam, your wrong world-model gets a slap on the nose.

Unfortunately that mechanism, too, makes you all the more eager to cut-and-paste citations for your basic facts, rather than state the facts as you interpret them (which is far more revealing of your argument's structure: forwards, from facts to conclusion, vs. backwards, from conclusion to facts).

One big drawback is that it is annoying for those who do not actually have screwed-up world-models and just want to know the truth. These folks have to look up whether the assertions are correct. But it is not such a big drawback, as their looking up the sources themselves eliminates the effects of your cherry-picking.

Another drawback is that it results in content that can look like it has lower quality. In terms of marketing value, it is a worse product - it might slap your world-model on the nose. It just doesn't sell well. But we aren't writing for sale, are we?

Another thing to keep in mind is that citations let you separate hypotheses from facts, and that is very useful. It would be great to do this in an alternative way for basic knowledge: by marking hypotheses with "I think" and facts with strong assertions like "it is a fact that". Unfortunately that can make you look very foolish - that fool is sticking his neck out into the guillotine of testable statements! Few have the guts to do that, and many of the few that do may well not be the most intelligent.

And of course this only works tolerably well when we are certain enough that incorrect factual assertions will quickly be challenged. Fortunately, that is usually the case on the internet. Otherwise, people can slip in incorrect assertions.

Ahh, and also: try not to use the above to rationalize not looking up the sources because it's a chore.

edit: changed to a much better title. edit: realized that italics are a poor choice for the summary, which needs to be the most readable part.


My summary of your post:

People who don't like an argument for political reasons will attack either the argument's facts or the argument's reasoning. If the facts are very well-cited, people are less likely to attack the facts. This is bad because we want people to attack the facts; reasoning-based arguments quickly devolve into worthless semantics, whereas fact-based arguments will eventually provide some evidence for or against a political theory. Therefore, some citations, especially the weaker or more controversial citations, should ideally be left out of political arguments.

My critique:

  • People who don't like an argument for political reasons will attack its facts and reasoning whenever they see an opportunity to do so -- there is no XOR function here. If you are debating politics with barely rational people, you will get into a semantic slugfest regardless of how well-cited your facts are.

  • I think you overestimate how likely people are to look up un-cited facts. Even if people are twice as likely to modify their opinions when they look up the damning facts themselves as when they read the damning facts in citations provided by an opponent, people are probably ten times as likely to read a citation as they are to go do independent research. Citations may be less effective for each person who reads them, but they'll be read by far more people, and I think the latter effect is stronger (roughly 10 × 1 vs. 1 × 2 in expected changed minds, on those numbers).

  • Even if your premises are correct, a better response would be to take care to strengthen your reasoning and make your assumptions and definitions explicit. Rather than spend effort weakening the apparent strength of the facts, spend effort enhancing the apparent strength of the syllogism. It's impossible to eliminate literally all logic/semantics based challenges, but it's quite doable to write a short essay that preempts all but the silliest semantic challenges, and (assuming you have an audience) people will realize fairly quickly that your lone heckler is trying to show off his rhetorical skills rather than trying to make an important point.

People who don't like an argument for political reasons will attack its facts and reasoning whenever they see an opportunity to do so -- there is no XOR function here.

I never claimed there was, and it's entirely irrelevant to the point. I only claimed that they will attack the facts (also, the facts are first on the list, and often people stop right there out of laziness. edit: that's it - not XOR but 'or else', aka the garden-variety 'or' that stops if the first part is true). After they have attacked the facts, if the facts are then presented, there is a minor decrease in the weight assigned to the world-model that led them to attack the facts. No big effects are claimed; it's just that the usual decrease is zero, and that can be a significant difference.

Even if your premises are correct, a better response would be to take care to strengthen your reasoning and make your assumptions and definitions explicit.

Strengthening of reasoning is most necessary when you form the opinion. Very often, by the time people express their opinion, all they are strengthening are the justifications for an already-formed opinion, whatever that opinion might be.

I think I get the point you're trying to make: that making people do more of the work of thinking on their own (forming ideology-based opinions on whether unsourced facts are true or untrue, and looking them up themselves) makes it more likely that they will change those ideologically motivated beliefs if it turns out they were wrong about the facts.

I agree that the more time people spend thinking about a topic, the more likely they are to change their mind. That's what curiosity is. However, I don't know if your specific strategy (presenting controversial unsourced facts instead of citing the sources in your articles) would actually work. What is your evidence that incorrect factual assertions will quickly be challenged on the internet?

Also, people whose beliefs are ideologically motivated are not likely to be curious. Deciding that a controversial fact conflicts with their beliefs and thus must be untrue won't necessarily lead to them looking it up and then updating... it seems, based on my experience with this kind of person, that they would be more likely to ignore the facts because "the author didn't cite his sources, therefore he is poorly educated/low-status/stupid, therefore I don't have to listen to him." Conversely, an easy-going person with no particular opinion on a topic (as I've been guilty of being in the past) might simply absorb the facts as written without bothering to look them up, since they don't conflict with any other beliefs and therefore aren't very interesting.

In my experience, most arguers bite the bullet, and actually do make assertions that facts are false. Political ones especially so. They see 'dangerous enemy propaganda', they go on to counter it regardless of how curious they are, and the first thing to attack is the facts. Then their political motivation effectively slaps them on the nose, instead of rewarding them with a dopamine rush for having cleverly countered a complex argument.

Furthermore, this is not for complex stuff. This is for basic domain-specific knowledge.

Regarding incorrect factual knowledge: well, correct factual knowledge sure always gets challenged by someone if there's no source attached, so why wouldn't the incorrect? Everyone loves to be right. It's easy to be right about facts. (Excluding edge cases, of course, i.e. highly biased audiences.)

Furthermore, it is poor form to learn domain-specific knowledge from someone's argument, cited or not. One should use a textbook. Compiling bits from many sources needs to be done carefully, and that's what a good textbook does. Learning bits of information about complex topics out of context, just for the sake of processing an argument, is an approach that leads to much misunderstanding of the basics.

In my experience, most arguers bite the bullet, and actually do make assertions that facts are false. Political ones especially so.

I'll take your word for it. Maybe we argue with different sorts of people. I'm not really an arguing kind of person either, especially not on political subjects (blah blah blah boring...), so we may well have just had different experiences.

Regarding incorrect factual knowledge: well, correct factual knowledge sure always gets challenged by someone if there's no source attached, so why wouldn't the incorrect?

I see no reason why this should be universally true. True most of the time in many kinds of Internet forums, maybe, but that's not as strong an assertion as saying it's always true.

Furthermore, it is poor form to learn domain-specific knowledge from someone's argument, cited or not.

Agreed. But we're not talking about the ideal, perfect knowledge-acquiring human being here; we're talking about people as they are. (Or I assume we are, anyway, since ideologically motivated, non-curious individuals are not exactly models of rationality either.) I consider myself an unusually widely read person, and still, in domains that I don't care much about or don't find interesting, lots of 'facts' come from other people who do care about those domains bringing them up in conversation. This isn't a good thing, or any kind of model to hold up. It's just true about me, and probably about lots of other people. Of course it leads to misunderstanding of the basics, and that sucks, but that doesn't make it not true... and refusing to cite your sources in essays won't do anything about it either.

So, wait. You mean, it's better to make stuff up that you'd expect based on your worldview, because then when someone calls you on it you could change your mind?

If that's not what you mean, this could use a rewrite.

Because then you get calibrated properly, and because then it is far easier for others to tell that you are rationalizing, since the facts are far cheaper to check. I don't propose that you make up what you'd expect; that just naturally tends to happen for most people, and it had better be producing detectable wrongness. Better for "us" - of course you'd feel stupid.

edit: how can you even interpret it this way? "Read up on facts to make sure you are not wrong before posting, not only to persuade." is in the bloody abstract. You write what you think is true, then you read up on the facts to make sure you aren't wrong; if you are off, your expertise is too poor or your cognition too motivated, and you are likely to be incorrect.

Another giant positive effect (for society) happens when you are wrong, and you are the one who has been generating facts from a world-model. Someone looks up the facts, and then, blam, your wrong world-model gets a slap on the nose.

This is where I got that notion.

Yeah, it was sloppy of me to phrase it that way. Sorry about that.

What I see happening instead is that people have a conclusion that they didn't derive from facts, then they go fishing for facts to support that conclusion, and if what they believed doesn't actually follow from the facts, they introduce the errors into the structure of the argument, where they can be denied all day long. No. You write down why you believed it in the first place; then your errors will be in the facts. Then, ideally, you check whether you actually misremembered the facts; if you did, that makes it likely you believed wrongly in the first place.

The few times I have ever seen people change their view online were when they quickly dumped why they believed something, and then were shown that they believed it on the basis of wrong factual knowledge (excluding a few special cases with math, where one can demonstrate errors).

And note: I am speaking of elementary domain-specific knowledge here - the kind that anyone with any expertise in the topic would have. You still aren't sourcing the majority of the assumptions you are making about this knowledge; the assumptions are just implicit rather than explicit. If you can't get a few explicit ones right from memory, then you're no expert and must study the topic properly before forming opinions.

Better to be testably wrong than to generate nontestable wrongness

The big question is, is it better to be testably wrong or untestably right? :-)

More seriously: on what objective or documented evidence do you base your theory? If it's just your general feeling based on personal experience (but you haven't exhaustively documented all relevant personal experience during your life), then I believe you know that human biases make this evidence practically negligible.
