A part which seems missing in the discourse -- probably because of politeness or strategy -- is that there are more than two sides, and that people on your side don't necessarily share all your values. When someone tells you: "Harry, look how rational I am; now do the rational thing and follow me in my quest to maximize my utility function!" it may be appropriate to respond: "Professor Quirrell, I have no doubts about your superb rationalist skills, but I'd rather use my own strategy to maximize my utility function." Your partner doesn't have to be literally Voldemort; mere corrupted hardware will do the job.
On the battlefield, some people share the common goal, and some people just enjoy fighting. Attacking the enemy makes both of them happy, but not for the same reasons. The latter will always advocate violence as the best strategy for reaching the goal. (The same thing happens on the other side, too.)
And an important part of the civilizing process Scott described is recognizing that both your side and the other side are at constant risk of being hijacked by people who derive their benefits from fighting itself, and who may actually be more similar to their counterparts on the other side than to you.
The problem with Yvain's reply is that he omits the main reason why lying is a bad idea. Yvain compares lying to violence. I don't think this is a good comparison. It's acceptable to respond to violence with violence. It's not a good idea to respond to lies with lies.
Eliezer touched on this issue in his post here, where he pointed out that one problem with lying to support a cause is that you'd better be absolutely sure that all your beliefs about the cause, and about what to do for it, are in fact correct. However, the problem is even worse: there is a vicious cycle here, since a cause that frequently lies is much more likely to acquire incorrect beliefs.
Think about it this way: suppose you believe that your cause justifies lying, so you lie about it. Your lies attract people to your cause who believe those lies. They in turn make up further lies (lies that they think are justified, based on the lies they believe to be true). And so on until your cause's belief system is full of falsehoods and anti-epistemology. Your cause may ultimately "win" in the sense that its followers acquire power, but by that point said followers may no longer care about your original goal. Even if they do, a belief system so riddled with falsehoods is unlikely to lead them to accomplish it.
A pithy way of summarizing the above comment:
If someone tells you his cause is so important that lying for it is justified, assume he's lying.
I'm with Scott. It's so natural to think that if your enemies are as ruthless as the Tsars and their goons, you need to be as ruthless as the Bolsheviks to fight them. But we all know how that worked out, and it hardly seems to be an outlier; rather, it seems to be the norm for those willing to sink to their opponents' level. If the goal is victory for our cause, and not just victory for some people who find it convenient to claim to be cheerleaders for our cause, we need to be very careful that our tactics are not training up Stalins within our ranks. Not that I'm advocating total purity at all times and in all respects, but I think before playing dirty you need to make sure you have a much better reason to think it's a good idea than "the other guys are doing it."
If the goal is victory for our cause, and not just victory for some people who find it convenient to claim to be cheerleaders for our cause, we need to be very careful that our tactics are not training up Stalins within our ranks.
Well said. Also, an additional benefit of rational discussion is that it promotes truthseeking - people may discover that the cause that they're supporting is not the cause that they should be supporting. Under a "win at all costs" paradigm, arguments against your position are enemy soldiers, so if you win, it'll be without seriously considering the arguments of the opposition. That increases the likelihood of you being wrong. If your goal is something beyond personal power - if it's something like "the correct thing should win and become dominant" and not "I, as I am now, should win and become dominant" - then honest discussion is even more useful.
I'm not entirely convinced that the relationship between crafting a rational argument and crafting a persuasive argument is nearly as inversely correlated as implied. On average, lies have a higher manufacturing cost (because you have to tread carefully and be more creative), a greater risk (since getting caught will lower your overall persuasiveness), and a smaller qualitative gain (while lies probably persuade more people, I suspect that they persuade fewer rationalists than civil debate does, so the persuasion they buy is of lower quality overall). There are other means of persuading people without making deliberately irrational arguments. If sound reasoning alone isn't tasteful enough for you, why not season your truth with charm instead of coating it in sophistry? Why not leverage charisma or cordiality? You know - the dark art of sucking up?
While fear is often heralded in psychological communities as the most effective mechanism of persuasion, that doesn't mean it's the mechanism of persuasion with the greatest utility. A well-beaten child might obey best, but obedience isn't the only goal of discipline - nor agreement the only goal of argumentation. Personally, I'd rather treat every worthy cause as an opportunity for non-rationalists to exercise rationality than as an excuse for rationalists to manipulate non-rationalists. This tactic might not win every argument now, but it lays a surer foundation on which to build our arguments in the future.
Whether or not the lawful-goods of the world like Yvain are right, they are common. There are tons of people who want to side with good causes, but who are repulsed by the dark side even when it is used in favor of those causes. Maybe they aren't playing to win, but you don't play to win by saying you hate them for following their lawful code.
For many people, the lawful code of "I'm siding with the truth" comes before the good code of "I'm going to press whatever issue." When these people see a movement playing dirty -- advocating arguments-as-soldiers, deciding whether to attack an argument based on whether it's for your side rather than whether it's a good argument, getting mad at people for pointing out bad arguments from your own side -- they begin to suspect that your side is not the "Side of Truth". So you lose potential recruits. And the real Sith lords -- not the ones who are trying to use the dark side for good -- will have much less trouble hijacking your movement once the lawful-goods, with their annoying code and the social standards they impose, are gone.
Leaving aside the honor-among-foes idea, and the "what if you're really the villain" idea,...
Chu's position -- at least, as presented at Yvain's blog -- seems to dip into the realm of being a guardian of truth. To me, that position is always scary... even if it comes from the "good guys".
I'd be interested in people's opinions of how dire the real-world consequences have to be before it's worthwhile debating dishonestly.
I, for one, have the impression that the more dire the consequences, the more important honesty in arguments becomes. So I don't really get your dilemma.
Obamacare only became law because Obama lied by saying that under the law "If you like your health care plan, you can keep it." PolitiFact made this their lie of the year.
I suspect that many on the left knew at the time Obama was lying about this but kept quiet because they really wanted the law to pass. They won.
To credibly show the truth. Claims of Hitler-equivalent societal doom are a dime a dozen. Almost all of them are false.
Scott's examples have a fair amount of selection bias. If you take Chile, Russia, North Korea, or Zimbabwe, it's those who play dirty who prevail. However, I agree that building a walled garden and making it attractive to join is a far better strategy whenever feasible.
Nice. This is another confirmation of something that's becoming increasingly apparent to me, and it raises the same issue I've been thinking about.
I'm of the rationalist libertarian persuasion. We value truth, honesty, and a lack of coercion in human interaction. When you argue, you argue honestly. You don't lie, you admit when the other side scores points, etc. Politically, you respect the freedom and equal rights of others, and don't use force to violate those rights. But we live in a world of people who do not share those values. By our lights, these people...
The fact that we don't shoot each other literally and verbally is one thing that allows a website like LessWrong to exist.
The alternative would be splitting the website into a dozen subsites: More Right, More Left, More Free, More Feminist, More Vegetarian, etc., which I suspect wouldn't remain rational for long, although some of them might keep the word "rationality" as their local applause light.
Would that improve the world? My first guess is that these diverse websites would mostly cancel each other out, so the net result would be zero. As for the impact on their personal lives, people would probably spend less time studying and more time inventing smart-sounding political arguments -- which other big parts of the internet are already doing, so they would be just another drop in the ocean.
Scott just responded here, with a number of points relevant to the topic of rationalist communities.
I would assume there was supposed to be a link there?
Obviously, at some point being polite in our arguments is silly.
I think you seldom convince someone to change his opinion by name-calling.
I once went to a talk about the implications of neurology for economics. Unfortunately for the professor who gave the talk, he had a badly dressed conspiracy theorist in his audience who was upset about the professor providing a new way to justify the economic status quo. That talk would have benefited from throwing out the conspiracy theorist instead of being nice to him. The reason isn't that the conspiracy theorist ...
I think that a conception of heroic morality (basically, whether or not to use TDT, or the choice between act and rule utilitarianism) may be at the heart of many of the choices to be cooperative/nice or not. Many people seem to assume that they should always play the hero, while the more virtuous ones who don't make that assumption seem to think that you would never be able to play the hero.
As an example, consider assassinating Hitler. It's not clear how Hitler could retaliate in kind -- he is already killing people who disagree with him, and he is a single tyrant while you are an invisible individual. This does not apply, however, if the conflict is between two roughly equal factions, say Fascists and Communists.
I don't understand all the consequentialist arguments against playing dirty. If your only objections are practical, then you're open to subtle dirty maneuvers that have very high payoffs.
A really simple example of this would be to ignore articulate opponents and spend most of your energy publicly destroying the opposition's overzealous lowest-common-denominators. This is actually how most of politics works...
... and also how this conversation seems to be working, since the Scott Alexander side seems more intent on arguing through hyperbole than addressing...
Just think about how much more persuasive fighting dirty sounds if the whole fate of the human race hangs in the balance. As it is, there is an underlying assumption that we have infinite time to grind down our opposition with passive logical superiority.
"How dire [do] the real world consequences have to be before it's worthwhile debating dishonestly"?
One lower bound is:
If the amount that rationality affects humanity and the universe is decreasing over the long term. (Note that if humanity is destroyed, the amount that rationality affects the universe probably decreases).
Scott, known on LessWrong as Yvain, recently wrote a post complaining about an inaccurate rape statistic.
Arthur Chu, who recently became notable for winning money on Jeopardy!, argued against Scott's stance that we should be honest in arguments, in a comment thread on Jeff Kaufman's Facebook profile, which can be read here.
Scott just responded here, with a number of points relevant to the topic of rationalist communities.
I am interested in what LW thinks of this.
Obviously, at some point being polite in our arguments is silly. I'd be interested in people's opinions of how dire the real-world consequences have to be before it's worthwhile debating dishonestly.