Scott, known on LessWrong as Yvain, recently wrote a post complaining about an inaccurate rape statistic.

Arthur Chu, notable for his recent winning streak on Jeopardy!, argued against Scott's stance that we should be honest in arguments, in a comment thread on Jeff Kaufman's Facebook profile, which can be read here.

Scott just responded here, with a number of points relevant to the topic of rationalist communities.

I am interested in what LW thinks of this.

Obviously, at some point being polite in our arguments is silly. I'd be interested in people's opinions on how dire the real-world consequences have to be before it's worthwhile to debate dishonestly.

137 comments

A part which seems missing in the discourse -- probably because of politeness or strategy -- is that there are more than two sides, and that people on your side don't necessarily share all your values. When someone tells you: "Harry, look how rational I am; now do the rational thing and follow me in my quest to maximize my utility function!" it may be appropriate to respond: "Professor Quirrell, I have no doubts about your superb rationalist skills, but I'd rather use my own strategy to maximize my utility function." Your partner doesn't have to be literally Voldemort; mere corrupted hardware will do the job.

On the battlefield, some people share the common goal, and some people just enjoy fighting. Attacking the enemy makes both of them happy, but not for the same reasons. The latter will always advocate violence as the best strategy for reaching the goal. (The same thing happens on the other side, too.)

And an important part of the civilizing process Scott described is recognizing that both your side and the other side are at constant risk of being hijacked by people who derive their benefits from fighting itself, and who may actually be more similar to their c... (read more)

Ben Pace
cough cough
'kay, fixed
Ben Pace
Tbh, I just found it funny that you said that when your example actually was Voldemort.
This part doesn't make clear enough the observation that X2 and Y2 are cooperating, across enemy lines, to weaken X1 and Y1; the 2 subgroups being politeness and community, and the 1 subgroups being psychopathy and violence.
Disclaimer: I mentioned psychopaths and violent people, but that's in the context of an actual war and actual killing. If we only speak about "fighting" metaphorically, we need to appropriately redefine what it means to be "violent". In the context of verbal internet wars, the analogy of psychopaths would be trolls, and the analogy of people who enjoy violence would be people who enjoy winning debates. For the internet version of Genghis Khan, the greatest joy is to defeat his enemies in public discourse, make them unpopular, destroy their websites, and take over their followers. The important thing is to win the popularity contest; having a better model of reality is only incidental. The thing to protect is the pleasure of winning, but other people's applause lights can be used strategically.

A person from X1 has friends only in X1 and X2. A person from X2 has friends in X1, X2, and Y2. Assuming that having more friends is an advantage, the mutual politeness creates an advantage for people from X2 and Y2, and this is why they are doing it. I'd call that cooperation. In their case, cooperation is both a strategy and a goal. In a way, people from X1 and Y1 also cooperate, but this cooperation is purely instrumental, as they hate each other. However, any act that successfully increases the mutual hate between groups X and Y helps them both, because it reduces their relative disadvantage against the 2s.
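The friend-count arithmetic behind this claim can be made concrete. A minimal sketch (the equal group size and the "who can befriend whom" rule are illustrative assumptions drawn from the comment, not anything measured):

```python
# Illustrative model: X and Y are opposing sides; subgroup 1 fights for
# fighting's sake, subgroup 2 values politeness and community.
# Rule from the comment: X1 befriends only X1 and X2; X2 befriends X1, X2,
# and Y2 (symmetrically for Y). Assume equal subgroup sizes for simplicity.

SIZE = 10  # hypothetical members per subgroup

# Which subgroups each subgroup can draw friends from, under mutual politeness.
friend_pools = {
    "X1": ["X1", "X2"],
    "X2": ["X1", "X2", "Y2"],
    "Y1": ["Y1", "Y2"],
    "Y2": ["Y1", "Y2", "X2"],
}

def potential_friends(group: str) -> int:
    """Count a member's potential friends, excluding themselves."""
    return sum(SIZE for g in friend_pools[group]) - 1

for g in ["X1", "X2", "Y1", "Y2"]:
    print(g, potential_friends(g))
```

Under these toy numbers, members of the polite subgroups (X2, Y2) have about 50% more potential friends than members of X1 and Y1, which is the structural advantage the comment says mutual politeness creates.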

The problem with Yvain's reply is that he omits the main reason why lying is a bad idea. Yvain compares lying to violence. I don't think this is a good comparison. It's acceptable to respond to violence with violence. It's not a good idea to respond to lies with lies.

Eliezer touched on this issue in his post here, where he pointed out that one problem with lying to support a cause is that you'd better be absolutely sure that all your beliefs about the cause, and what to do for it, are in fact correct. However, the problem is even worse: there is a vicious cycle here, since a cause that frequently lies is much more likely to acquire incorrect beliefs.

Think about it this way: suppose you believe that your cause justifies lying, so you lie about it. Your lies attract people to your cause who believe those lies. They in turn make up further lies (that they think are justified based on the lies they believe to be true). And so on until your cause's belief system is full of falsehoods and anti-epistemology. Your cause may ultimately "win" in the sense that its followers acquire power, but by that point said followers may no longer care about your original goal. Even if t... (read more)

A pithy way of summarizing the above comment:

If someone tells you his cause is so important that lying for it is justified, assume he's lying.

This wins my personal "rationality quote of the decade" award.
One implication of this is that we can develop heuristics for how bad different lies are. The basic idea is that lies that are likely to spread (especially if their effectiveness depends on them spreading) are particularly bad, especially if they're likely to spread within your movement (note that lies used to increase support for your movement count here, since they'll bring in new recruits who believe them). Note: using these heuristics, we can see that the classic example used to justify lying, "There are no Jews in my basement", is in fact much less bad than Yvain's example: "A man is more likely to be struck by lightning than be falsely accused of rape."
Would you elaborate? First, I'm not sure what it means to say that "There are no Jews in my basement" is unlikely to spread. In a sense it's a "pre-spread" lie, since the lack of Gestapo breaking down your doors implies that they are all already fairly confident of the falsehood; you're just lying to decrease the probability that they'll stop believing it. Second, to add my own hypothetical: I can see an isomorphism (in terms of how the lie spreads) between "There are no Jews in my basement" and "There are no embezzled charity funds in my basement". Obviously this isomorphism doesn't extend to the morality of the lies, which makes it hard for me to see a connection between spreadability and immorality.
The Gestapo member is likely to have forgotten all about that specific lie by the time he finishes asking everyone on the block. Disagree: the lies themselves are comparable; the difference in morality comes from the difference between the goals the lies are being used for.

I'm with Scott. It's so natural to think that if your enemies are as ruthless as the Tsars and their goons, you need to be as ruthless as the Bolsheviks to fight them. But we all know how that worked out, and it hardly seems to be an outlier; rather, it seems to be the norm for those willing to sink to their opponents' level. If the goal is victory for our cause, and not just victory for some people who find it convenient to claim to be cheerleaders for our cause, we need to be very careful that our tactics are not training up Stalins within our ranks. Not that I'm advocating total purity at all times and in all respects, but I think before playing dirty you need to make sure you have a much better reason to think it's a good idea than "the other guys are doing it."

If the goal is victory for our cause, and not just victory for some people who find it convenient to claim to be cheerleaders for our cause, we need to be very careful that our tactics are not training up Stalins within our ranks.

Well said. Also, an additional benefit of rational discussion is that it promotes truthseeking - people may discover that the cause that they're supporting is not the cause that they should be supporting. Under a "win at all costs" paradigm, arguments against your position are enemy soldiers, so if you win, it'll be without seriously considering the arguments of the opposition. That increases the likelihood of you being wrong. If your goal is something beyond personal power - if it's something like "the correct thing should win and become dominant" and not "I, as I am now, should win and become dominant" - then honest discussion is even more useful.

Also, as I mentioned here even if your initial cause was right, by lying about it you'll attract people who believe your lies. Thus, eventually your cause is likely to morph into something that is a bad idea.
Every revolution eats its own children.
Ben Pace
Other than the annihilation of the baby-eaters... But otherwise a really cool quote.
Of course. The goal isn't to match the opponent, the goal is an effective strategy to further your own ends. Complete pacifism in the face of abuse is probably not it.
People seem to overestimate the effectiveness of playing dirty, though. Perhaps willingness to play dirty signals commitment, and I expect some of the time people are more interested in showing off their commitment than actually making progress toward the putative goal. But in any event, playing dirty has all sorts of costs (some discussed in this thread) which people seem to ignore or underestimate, and my only point is that it's a strategy to be employed only when it still seems like the best option even after all the costs and risks have been considered.
Yeah, that's pretty much my take. Often, signalling the willingness to play dirty without actually doing so gets us the collective benefits of "niceness, community, and civilization" while also getting us some extra individual benefits on top of that. And asserting that playing dirty is effective and that rational agents should be willing to play dirty can be an effective way of signalling that willingness.
Until someone comes along, reads all the stuff you wrote about the importance of playing dirty, and believes you.
Or alternatively uses it to argue that you aren't trustworthy because you are willing to play dirty.
I've been considering precommitting to this: if someone in a group I'm a part of plays dirty or uses blackmail, I'll delete all of his/her reputation points in my head, and impose a moratorium on when he/she can start earning reputation points with me again. I would do this regardless of the success of what he/she did to the group. Is this wise?
Scott Garrabrant
It is perhaps not wise to have such an all or nothing reaction to something that is as hard to define as "plays dirty" or "uses blackmail."
What do you mean by "complete pacifism"? The way to fight someone who spreads lies about you is not to spread lies about them; it's to spread the truth about them.
When I speak of fighting back, I'm talking about making them pay a cost, and not feeling constrained to play fair for their sake. They've forfeited that consideration. If you have overriding reasons to tell the truth, do so. But not to preserve value for them. When someone attacks you, it's time to destroy values for them.
Agreed, however, as I argue here the biggest reason for not lying for your cause isn't for their sake, it's for yours.

I'm not entirely convinced that the relationship between crafting a rational argument and crafting a persuasive argument is nearly as inverse-correlational as implied. On average, lies have a higher manufacturing cost (because you have to tread carefully and be more creative), a greater risk (since getting caught will lower your overall persuasiveness), and a smaller qualitative gain (while lies probably persuade more people, I suspect that they persuade fewer rationalists than civil debate does, and so yield a lower-quality audience overall). There are other means of persuading people without making deliberately irrational arguments. If sound reasoning alone isn't tasteful enough for you, why not season your truth with charm instead of coating it in sophistry? Why not leverage charisma or cordiality? You know - the dark art of sucking up?

While fear is often heralded in psychological communities as the most effective mechanism of persuasion, that doesn't mean it's the mechanism of persuasion with the greatest utility. A well-beaten child might obey best, but obedience isn't the only goal of discipline - nor agreement the only goal of argumentation. Personally, I'd rather treat every worthy cause as an opportunity for non-rationalists to exercise rationality than as an excuse for rationalists to manipulate non-rationalists. This tactic might not win every argument now, but it lays a surer foundation on which to build our arguments in the future.

I've heard this referenced somewhere as the difference between persuading someone and convincing them. You can apply rhetoric or logic until someone verbally accepts your arguments, but that is not the same as getting them to genuinely believe that what you are saying is true. Sometimes people will say "Okay, you're right" just to get you to shut up.
Hardly. It's much easier to throw bullshit at the wall than to clean it off. In many public debates, people shovel outright lies again and again. In the time it takes for you to properly evaluate their lie, they've shoveled 50 more. Also, there are lies, and then there is conceptual muddle that's not even false. Try cleaning that up. Conceptual muddle takes centuries to clean up. Since when? With whom? Who has paid enough attention to keep track? This is one of the fundamental problems with public debates - nobody is keeping score. If the people who already agree with you even notice, they'll likely shrug it off as a tactic, or just shift their attention to the next piece of bullshit supporting their views that they haven't yet seen through. It looks like you're suggesting that rationalists count more - somehow? Even if they do, they don't have the numbers. Rationalists are good if you want someone to produce useful epistemic truths. If you want to persuade masses of people, probably not so good. They don't persuade, and their numbers are so small that persuading them doesn't take you very far in the aggregate. People use the Dark Arts because they're effective. Otherwise they'd be called the Dark Incompetencies.
You've got the wrong kind of fear there-- the effective use of fear is to make your listener afraid of some third party or event, not to make them afraid of you. If you make people afraid of you, they might give in, especially if you have physical power over them. You might get useful compliance that way. However, you're also likely to get people to avoid you if they can, or to push back compulsively.

Whether or not the lawful-goods of the world like Yvain are right, they are common. There are tons of people who want to side with good causes, but who are repulsed by the dark side even when it's used in favor of those causes. Maybe they aren't playing to win, but you don't play to win by saying you hate them for following their lawful code.

For many people, the lawful code of "I'm siding with the truth" comes before the good code of "I'm going to press whatever issue." When these people see a movement playing dirty - advocating arguments-as-soldiers, where you decide whether to argue against an argument based on whether it's for your side rather than whether it's a good argument, and getting mad at people for pointing out bad arguments from their side - they begin to suspect that your side is not the "Side of Truth". So you lose potential recruits. And the real Sith lords, not the ones who are trying to use the dark side for good, will have much less trouble hijacking your movement with the lawful-goods and their annoying code and the social standards they impose gone.

Leaving aside the honor among foes idea, and the "what if you're really the villain" idea,... (read more)

Could you post a screenshot or archived version of your Facebook link?

Chu's position -- at least, as presented at Yvain's blog -- seems to dip into the realm of being a guardian of truth. To me, that position is always scary... even if it comes from the "good guys".

I would personally suggest that anyone considering this issue seriously read the multitude of comments Chu made on the JKaufmann blog post. It may or may not change your mind, but it's solid evidence, and it's easily accessible. If you took the time to read Yvain's post, the 2-10 minutes, depending on your reading speed, to read all of Chu's comments in their original context is time well spent.
Done. I think the most significant point Chu made which didn't come across in the other summaries was that "some ideas are inherently dangerous and must not be allowed to spread", and that neoreaction is among those. So I guess that a lot of the disagreement comes down to how dangerous you believe the ideas are. A big reason I feel comfortable reading Moldbug looking for interesting points of view is that his ideas have lost so thoroughly: regardless of his feelings that black people would be better off as slaves, the probability that slavery will be reinstated in America is basically zero (except perhaps in a complete collapse of civilization). If I believed that discussing Moldbug carried an appreciable risk of destroying modern liberal society, then I wouldn't. (Indeed, since the pseudo-Nazi revival in Greece in recent years, I have felt a bit less comfortable about Moldbug too. Suddenly, liberal democracy seems slightly less secure.)
Upvoted for follow-through. How do you feel about less intense negatives, such as social regression? Things like how American conservatives essentially export their bullshit all over the world, such as the rise of American anti-abortion tactics in Britain, and the role American conservatives played in the anti-gay movement in Russia? Or just certain anti-gay, anti-woman, anti-POC stances in America? For instance, Arizona passed, or tried to pass, a law allowing businesses to discriminate against LGBT and racial minorities on the grounds of religious freedom. At what point does it become problematic enough that we should stop debating and crack down on these behaviors? Although crackdowns may be of varying levels. No need to send in the army to round up Arizona legislators.
Care to define what you mean by "social regression", and also explain why it's a bad thing? Why should discrimination be illegal? Also, while we're on the subject, should churches be forbidden to discriminate on religion?
What happens in America significantly influences the rest of the world. Yes. But it's almost orthogonal to the question of how to win the battle within America. If the hypothesis "if you play nice, you are more likely to win (because people will enjoy joining your side), and if you play dirty, you are more likely to lose (because neutral people will hate you, and you will also have a lot of internal fighting)" is true - which is the thing being debated - then the fact that the outcome in America will strongly influence the rest of the world just makes it more important to play nice in America. More meta: if you believe some strategy is the winning strategy, increasing the stakes should make you follow the strategy more carefully, not abandon it.
This is not even an accurate summary of the law in question.
How sure are you that "modern liberal society" is in fact a good thing? What evidence convinced you to believe this? How sure are you that evidence wasn't fabricated by someone who also thought lying was justified to protect modern liberal society?
Upvoted, not because I have anything against modern liberal society, but because we should routinely question our beliefs.
My point is more than that. It is that by lying for a cause you've made it much harder to properly question any of its beliefs. After all, properly questioning something requires getting accurate data, which is much harder if you're also spreading false data about the subject.
Does Moldbug actually believe that?
That's unexpected.

I'd be interested in people's opinions of how dire the real world consequences have to be before it's worthwhile debating dishonestly

I, for one, have the impression that the more dire the consequences, the more important honesty in arguments becomes. So I don't really get your dilemma.

What if you are Jewish and are trying to stop a Hitler from coming to power and the best means would be to spread a deliberate lie about him. Are you saying that the worse the outcome would be, the less likely you would be to lie?
Nobody in this discussion is confronting a present or potential totalitarian state bent on murder so this feels like a tangent. In fact, this is a hypothetical that very few people are ever confronted with and therefore it isn't relevant to a question of practical ethics. Very few people are skilled enough at predicting the future to know when the situation is dire or whether dishonesty will work; very few people are skilled enough manipulators to pull it off. For the range of social issues the participants in this conversation are likely to confront, I think it's a good policy to be more careful and honest the higher the stakes. Among other things, the higher the stakes, the likelier a lie or mistake is to be caught. And being caught lying doesn't generally achieve any goal of the liar.

Obamacare only became law because Obama lied by saying that under the law "If you like your health care plan, you can keep it." PolitiFact made this their lie of the year.

I suspect that many on the left knew at the time Obama was lying about this but kept quiet because they really wanted the law to pass. They won.

[Upvoted for giving a crisp, recent, and plausible example of people getting away, at least in the short term, with dishonesty. I was a little squeamish about the politicization of the topic, but I think it's hard to avoid giving a real political example in a conversation about political dishonesty.] I take the point that there's a complicated collective-action problem here where if enough people repeat something they wish were true, it can become relatively accepted, at least for a while. The catch is that, as happened here, people often get caught having been dishonest. And we will see how painful the consequences are for those people personally and politically.
Obama doesn't use truth as a strategy, but that doesn't change the fact that Cato was a very successful politician when it comes to people respecting his positions. They didn't lose, but they also didn't get the single-payer health care they wanted. I think US politics is ready for someone like Cato to come up and take it over. You don't win in politics by telling slightly fewer lies than your opponents. On the other hand, actually being honest has its advantages.
Scott Alexander
I think the relevant axis may be short-term/specific vs. long-term/broader consequences rather than unimportant vs. important. I think defecting is usually a long-term bad strategy but a short-term good one. If you're pretty sure there's not going to be a long-term unless you fix your short-term problems immediately, defecting might be a good idea for you or your chosen cause - not sure about for the world at large.
If I lie about him, then the most likely consequence is that Hitler will have verified proof that "Jews are lying about me". So the consequence is that I would end up helping cause the Holocaust, not stopping it. More generally, what's the point of using a hypothetical scenario where the assumption is that the best means would be to spread a lie, when that's exactly what I'm contesting (that lying is the best means)? That's begging the question. Tell me, in what exact way would I be in an epistemic position to know that lying is the best means?
The set of things you could say is vastly larger than the set of true things you could say, so unless lying is observed and punished, you should assume that you are probably better off at least occasionally lying. I'm a game theorist, and I think that wearing makeup, or acting more confident than you really are, are forms of lying that frequently benefit individuals.
I think I spent too much attention optimizing things like the clothing I was wearing and the way the background was arranged at my first TV interview. Being busy with tactics takes mental resources and builds anxiety. I would probably have done better if I had spoken from a more relaxed state of mind that doesn't worry that much about the background of the image. Do you in fact wear makeup on a regular basis?
No makeup, but I do fake confidence.
Why no makeup? It's possible to use makeup as a man in a way that accentuates manly features.
I'm open to the idea, I do dye my hair.
I think that if you really look at the makeup question, you will find that it's not cost-effective. A quick googling gives me the number that women spend on average 91 hours per year applying makeup. I think that a woman who instead spends the same amount of time in daily meditation sessions will get a higher return on her time investment. In a world full of superficial people, there's not much comparative advantage in trying to be better at being superficial than everyone else. I think it's a better strategy to compete based on personal depth. If you are open about who you are, that will make you more confident than if you walk around all the time with a mask.
Comparative advantage doesn't mean you can neglect something entirely. Personal attractiveness has large consequences for how people evaluate and treat you, and equally so for men and women, it looks like (Langlois et al 2000 (excerpts) claim gender is not a large moderator of beauty effects). Even if Miller goes beyond just dyeing his hair, he could still be well below optimal.
You can get personal attractiveness in different ways. I went in three years from being told that I never smile while dancing to being asked why I smile while dancing, without being able to give a reason. It's not because I specifically worked on my smile, but because I did emotional work on a deep level. At a family event yesterday someone told me that I look taller than when we last met a while ago, and I probably do appear taller than I did a year ago, because my body language changed as a result of deeper work. If you become a happier person who doesn't get anxiety because of all sorts of things happening around you, you will appear more attractive in any face-to-face encounter, and even in photos.

If I want to connect with another person, I care about perceiving the reactions that the words I speak have on the other person. If the woman with whom I'm talking doesn't show any facial reactions because she's on Botox, that makes it a lot harder for me to connect with her. A good quote in the CBT book "The Feeling Good Handbook" is: "You can never be loved for your successes - only for your vulnerabilities. People may be attracted to you and may admire you if you are a great success. They may also resent and envy you. But they can never love you for your success." Being vulnerable is useful. If all of your body language is fake and further signals are hidden by makeup, then you aren't vulnerable, and you make it hard for other people to love you.

gwern, you are in my mind one of the few individuals who usually walks his talk. Do you think it's useful to use makeup? Do you use it yourself? Especially since you cite a paper saying that gender isn't very important when it comes to the effects of beauty. Given the nature of the subject it might be hard to speak openly*, but do you do other black-hat stuff to manipulate the people you interact with into finding you more attractive?

*While I do promote openness, I'm also willing to treat information that'
I don't use makeup at the moment, but I have two main reasons for this: I interact with few people so I expect my gains to be less than average, and I am revolted by the very idea of using cosmetics or working on my appearance. (I think it's a mix of dislike of deception, laziness, and gender norms.) The former is fine as far as it goes, but as far as the latter is concerned... I admit it is a bad reason; I've been trying to improve matters by small compromising steps which don't trigger my dislike: purchasing better-looking glasses, improving my shaving routine, more regular exercise, throwing out the worst of my clothes.
It's indeed a problem when you simultaneously revolt against the idea of working on your appearance and think it's a high-benefit activity. I think the solution is either to work out that working on your appearance goes against your own values, or to resolve the emotional issues and work on your appearance. If you walk around and it's clear that your appearance isn't optimized because you don't believe in doing so, but could optimize it if you wanted to, that's respectable and can be a high-status move. If you appear to be trying hard to work on your appearance and fail because you revolt against the idea of working on it, that's no sign of social status. For myself, the time when I put the most attention on my appearance was one year into dancing Salsa. The activity gave me a new perception of my body, and after that I had internal motivation to improve. At the time I was also trying to optimize to affect other people, which I'm not really doing anymore, but I'm still not badly dressed. I'm no Zizek ;)
I'm not so sure. Women who don't wear makeup are much less attractive, which significantly reduces their social status and their dating-market value. These are things people greatly value. I think confidence depends mostly on practice, genetics, and situational factors. If anything, I think the superficiality-confidence connection goes the other way round: being confident makes people see you as more genuine, because of the halo effect, i.e. because everybody loves to hate low-status people. People without masks are weirdos, because what people call "being normal" is a learned behavior, a mask.
... depending on the eye of the beholder.
You will get less real practice if you are walking around with a mask. If you worry what other people think about your looks to the extent that you spend 30 minutes to look presentable, that will affect your confidence. Why do you believe that there's a difference between men and women in that regard? I think the fact that you separate the genders has a lot to do with status quo bias. That depends a lot on the environment in which you are moving. There are corporate environments where you are expected to wear a mask and where you can't drop it completely. Yet Steve Jobs, who was a Buddhist who meditated a lot, did very well while wearing a sweater instead of dressing in a suit. Steve Jobs wore no makeup, which is not typical for people who go in front of the camera and on big stages and have the budget for makeup stylists. For all the talk about game theory, strategy matters. If you want to play Steve Jobs's strategy, that's not compatible with spending a lot of effort on looking attractive, but rather with sitting a lot and meditating. Getting bogged down in tactics isn't good.
Then again, you'll be more confident wearing a mask the more practice you have with it. I think we mean different things by "masks", though. The quote I was replying to dealt exclusively with women. That said, there is a big difference, especially if by "makeup" you mean what everyone else means by "makeup". Men are not respected more if they wear eyeliner and blusher every day. I do think that men would benefit from optimizing their personal appearance, e.g. by getting rid of acne, whitening teeth, dressing better, wearing heel lifts, etc. Re Steve Jobs: giving one outlier as a counterexample does not undermine the general principle. Anyway, Jobs seems to be countersignalling: "I'm so awesome I don't even need to dress up for you to know that I'm awesome." It wouldn't have worked if people didn't already think highly of him, and I'm not even sure it worked at all. (We don't know what would have happened if he had cared more about his appearance.)
The kind of makeup that male actors on TV use isn't about wearing eyeliner. There are women who use makeup to look artificial and there are women who go for a "natural look". I don't think that a woman who goes for an artificial look will get more status in a yoga class than a woman who uses less makeup. The benefits of looking artificial depend on the social circle in which the woman moves. While Angela Merkel does use makeup, she didn't get into a position of political power by being good-looking. Using makeup to make her look very feminine wouldn't help her. A lot of men have acne because of hormonal issues. If I try to avoid showing emotions on my face, it gets tense. That reduces blood flow in my face. Less blood flow means that it's more difficult for my immune system to clear my face of the bacteria that cause acne. I don't think it's an accident that you see acne more often in asocial nerds than in bodybuilders. The difference also isn't that the bodybuilders went to more dermatologists. Your brain will always know that you are afraid to be open if you wear a mask for the sake of impressing other people. I don't have an issue with a woman putting on makeup because she enjoys putting on makeup, and I don't think that reduces confidence. I'm talking about making strategic choices. Steve Jobs is someone who went to Buddhist meditation retreats before he was famous. That distinguishes him. As a result he wasn't obsessed with optimizing his appearance on the makeup level. Putting on makeup to impress other people is not a choice that's in the cards for someone who has done that level of deep internal work. Yes, there are women on that level who put on makeup because they like the activity, but it comes from a very different place than putting on makeup to make people like them.
This is very interesting; is this a significant cause of acne, and if so how do you know? If this were true, we would expect that other things that decrease blood flow in the face (such as cold weather, maybe?) would also increase acne. Here are other hypotheses on acne, not sure whether they're true:

* Acne is a defense mechanism employed when the body detects that one is low-status. That is, it's a way of making yourself less threatening to the rest of the tribe so that they won't slaughter you. If true, this could be mediated by status hormones like testosterone and cortisol.
* Sunlight and/or shortage of vitamin D causes acne. (I have anecdotal evidence that tanning reduces acne.)
* Acne is caused by weird foods, such as dairy or sugar.
* Acne is caused by excessive face-washing, which screws up homeostatic processes controlling the amount of oil and water on the face.
* Acne is due to evolutionary inertia: after our ancestors became hairless, they didn't have enough time or enough evolutionary pressure to evolve to excrete less oil on overly oily areas.

Several of these would also explain the nerd-acne connection. Or, that connection could go the other way round, because acne could cause people to stay inside, have lower status, etc.
Cold weather might reduce your blood flow for a few hours, but it will come back to normal once you are in a warm environment again. If I were to run an experiment, I would attempt to measure how tense the muscles in the face are and how warm the skin is, and see whether those ratings correlate with the amount of acne. It's a working theory of mine at the moment. The background is that there are techniques in the hypnosis realm for resolving "trapped emotions" which often help resolve physical ailments. My perception of other people's bodies is also improving, and I'm getting better at noticing when a certain part of the person I'm interacting with is colder and tenser than it should be. As far as self-reports go, some people improve their acne by washing their faces less and others by washing them more. Hunter-gatherer tribes have nearly no acne. I would be wary of an explanation that focuses on the utility of getting acne in a hunter-gatherer tribe to explain the acne we have in Western civilisation.
This is true for certain subcultures. It is NOT true for other subcultures. And this is, of course, before we go into individual differences -- some girls are very pretty with a freshly-scrubbed look.
This crossed my mind as well, but for me spending 3-5 minutes on makeup in the morning is enough to make a substantial difference in my appearance. One has probably reached a point of diminishing returns by the time one makes it to 91 hours per year.
The best means to stop a Hitler would be to show the actual, ugly truth of where he'll lead us. Very few lies about Hitler could match the real horror.

To credibly show the truth. Claims of Hitler-equivalent societal doom are a dime a dozen. Almost all of them are false.

Almost all isn't that reassuring given the scope of the potential harm. Hitler democratically acquired power in an advanced civilized Western Christian nation while being fairly open about his terminal values. Fear of this pattern repeating is worth continually emphasizing.
The analogy isn't effective (outside the ingroup where it originates) unless it's credible; throwing it around in situations where it isn't in no way guards against the possibility of a recurrence of Nazism, or one of its less famous but often equally nasty companions in 20th-century totalitarianism. In fact, I'd say it's probably actively detrimental, as it makes the accusation less punchy when and if we do start seeing a totalizing popular movement that openly preaches extreme prejudice against an unpopular group of scapegoats. That's not to say that these kinds of mass movements aren't worth studying or analogies to modern movements can't be made; they absolutely are and can. But crying Nazi without commensurately serious justification can only cheapen the term once everyone catches on. Who cares about having one more political slur?
I think it's somewhere in Sun Tzu's Art of War. Often things are well hidden in plain sight. Hitler's biggest advantage was that nobody took him seriously.
And yet the German military didn't overthrow Hitler when he started messing up military strategy in Russia.
By that time, Hitler had put people he trusted into central positions of military power. Everyone Hitler considered untrustworthy had already been removed from power. Nobody succeeded in running a coup against him, though people did try, such as on the 20th of July. The military didn't follow Hitler's orders on subjects such as burning bridges in Germany.
A few tried, even specifically operating under the theory that the failures in Russia would make a post-assassination coup politically possible, in Operation Spark. I don't think this much affects your point, though; by the time a sufficiently evil person and/or group is in power, there doesn't seem to be any shortage of political and psychological mechanisms they can use to entrench there.
In a world with rational voters, yes. In our world you might want to start a false rumor, such as: "Hitler's Jew-hating is just a false cover for his true desire to reduce social welfare payments."
That rumor wouldn't spread. It's too complicated to be a good story that's believable to the average person in that time period. I think Bruce Sterling's novel Distraction is quite brilliant at illustrating how such principles work.
I was making an analogy to Bill Clinton's false claim that Bob Dole wanted to cut medical benefits to senior citizens. When confronted with his lie by Dole, Clinton reportedly said "You gotta do what you gotta do."
It's confusing to talk about the history of the 1930s with examples that come from the 1990s and aren't marked as such. It prevents you from learning the historical lessons that the 1930s do provide.
In Tea-Party constituencies, that'd be an argument in his favor.
No, smarter voters would see the purpose of the lie and vote against Hitler. (As a Tea Party person, I'm disassociating myself from Hitler.)
The Tea Party would probably support a candidate who they had reason to think wants to cut down welfare programs, even if there are some unnerving rumors about him.

Scott's examples have a fair amount of selection bias. If you take Chile, Russia, NK, or Zimbabwe, those who play dirty prevail. However, I agree that building a walled garden and making it attractive to join is a far better strategy whenever feasible.

This is class bias, though. Some people are not in a position to create and live in a walled garden, either in real life or on the internet, especially in places without an internet connection. Sure, it's great for the kind of people who use LessWrong: educated, white, middle class, mostly male (though I know there are several women who post regularly), obviously with internet access and lots of free time to write intense blog posts. I suppose you may have cancelled this out with "whenever feasible", but I also suspect the average LessWrong poster would not be a good judge of what's feasible. I've put a lot of thought into the politics of anger, and I see value and problems on both sides. Arthur Chu appears to have expressed a much more intense version than I've seen in, say, Feministe or Atheism+, although one Feministe commenter expressed her belief that she would be morally correct to murder, or burn down the houses of, people doing things which don't justify such a response in my personal opinion. I've never come across the idea of deception framed precisely the way Scott framed Arthur Chu's comments. Is there a link to the actual comment made by Chu?
Yes. Sucks to be them. You seem to be arguing something I've seen elsewhere, and which for the sake of definiteness I'll state in the strongest form. No-one ever achieves anything through their own efforts. All "success" is due to privilege, which is oppression. So-called self-help is privilege used as an excuse to blame the unprivileged for their situation, which situation is in reality due to oppression by the privileged. Showing people what they can do for themselves is disempowerment. Blaming their troubles on the forces of privilege is empowerment. Individual action is vice. Collective action is virtue. Do your thoughts point in that direction? "A lot of thought" is not a phrase that leaps to my mind on reading that. Does any of this seeing value and problems, and having personal beliefs and opinions lead to anything but more words on blogs? I'm glad you don't think murder and arson should be freely employed, but what does it mean to say that judgement is your personal opinion? That it doesn't matter?
That paragraph sounds awful. No, I don't think that. I'll be lazy and point to John Scalzi, I guess. I don't think that individual advice is useless; I'm skeptical that certain people are giving useful advice. Useful here involves a criterion of novelty: giving someone advice they have heard 1000 times is not helpful. I guess "necessary but not sufficient" is a good description of personal effort in this context. Working three jobs doesn't leave a lot of time to get educated and then use that education to post large amounts of philosophical text on a Reddit-like rationalist site. And being poor and black in the real world isn't the optimal condition for creating meatspace walled gardens. As for individual action, it's my opinion that individual charity is helpful but not sufficient, and it often comes with coercion I don't approve of. Religious charity would be a good example here. Strings attached? More like ropes. For the second thing you quoted, I wouldn't say that I am the most productive person in helping the less fortunate, though that's somewhat for psychological and/or financial reasons. This is orthogonal to the efficacy of certain strategies to promote social change. Just because I don't turn on the faucet doesn't mean that if I did, water wouldn't come out.
I think that the answer to this problem is that it will simply be necessary for class oppression to be ended, then.
Could you taboo "oppression"? SJ types (and Marxists in general) love throwing around that word, but I've never seen a coherent definition beyond connoting something they disapprove of.

It is rather amusing to treat this exchange as a debate on consequentialism.

It looks like all the participants are consequentialists in good standing. The argument is over whose model of the world more accurately predicts consequences.
As I mentioned on Slate Star Codex as well: it seems like if you let consequentialists predict the second-order consequences of their actions, they strike violence and deceit off the list of useful tactics, in much the same way that a consequentialist doctor doesn't slaughter the healthy traveler for the organ transplants to save five patients, because the consequentialist doctor knows that destroying trust in the medical establishment is a worse consequence.

Nice. This is another confirmation of something that's becoming increasingly apparent to me, and raises the same issue I've been thinking about.

I'm of the rationalist libertarian persuasion. We value truth, honesty, and a lack of coercion in human interaction. When you argue, you argue honestly. You don't lie, you admit when the other side scores points, etc. Politically, you respect the freedom and equal rights of others, and don't use force to violate those rights. But we live in a world of people who do not share those values. By our lights, these people... (read more)

The fact that we don't shoot each other literally and verbally is one thing that allows a website like LessWrong to exist.

The alternative would be splitting the website into dozen subsites: More Right, More Left, More Free, More Feminist, More Vegetarian, etc., which I suspect wouldn't remain rational for too long, although some of them might keep the word "rationality" as their local applause light.

Would that improve the world? My first guess is that these diverse websites would mostly cancel each other out, so the result would be zero. As for the impact on their personal lives, they would probably spend less time studying and more time inventing smart-sounding political arguments. Other big parts of the internet are already doing that, so they would be just another drop in the ocean.

Yes, here, where there is a sizable libertarian contingent, libertarians and progressives manage to be civil. And from my rationalist libertarian perspective, that's a good thing, and I wouldn't want it to change. I don't think it's good to initiate force. I think peaceful truces are good things. But I wasn't addressing the situation at LW; I was addressing the broader context, where rationalist libertarians are taking bullets but not returning fire. I consider pacifism a loser of a strategy. A better strategy, IMO, is some kind of proportionate tit for tat. But the first step is to realize that pacifism is the current strategy, and that it's probably a loser of a strategy. I'd say the same about nerds' near-pacifism in social contexts. "The great are great only because we are on our knees: let us rise."
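The "pacifism loses, proportionate tit for tat does better" intuition can be sketched with a toy iterated prisoner's dilemma. This is a minimal illustration under conventional assumptions (the standard 5/3/1/0 payoffs), not anyone's actual model of social conflict:

```python
# Toy iterated prisoner's dilemma: "pacifism" (always cooperate) is
# exploited every round by a defector, while tit for tat loses only once.
# Payoffs use the conventional T=5, R=3, P=1, S=0 values.

PAYOFF = {  # (my_move, their_move) -> my score; 'C' = cooperate, 'D' = defect
    ('C', 'C'): 3, ('C', 'D'): 0,
    ('D', 'C'): 5, ('D', 'D'): 1,
}

def always_cooperate(opponent_history):
    return 'C'

def always_defect(opponent_history):
    return 'D'

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror the opponent's previous move.
    return opponent_history[-1] if opponent_history else 'C'

def play(strat_a, strat_b, rounds=100):
    hist_a, hist_b = [], []  # each side's record of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFF[(a, b)]
        score_b += PAYOFF[(b, a)]
        hist_a.append(b)
        hist_b.append(a)
    return score_a, score_b

print(play(always_cooperate, always_defect))  # pacifist 0, defector 500
print(play(tit_for_tat, always_defect))       # tit for tat 99, defector 104
```

Against a pure defector the pacifist scores 0 while tit for tat stays within a few points of its exploiter, and two tit-for-tat players cooperate throughout; whether this toy generalizes to verbal conflict is, of course, exactly what the thread is arguing about.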
Well, two points come to mind. First, libertarians are by definition social pacifists if "social pacifism" is defined as refusal to use coercion to propagate your own memes and values. Second, rational libertarians who happen to be upper-middle-class college kids living in big coastal cities might be social pacifists. But I bet I can find some pretty rational, pretty libertarian guys somewhere in Wyoming, and they won't be pacifists at all.
Libertarians are supposed to refrain from initiating force, while pacifists refrain from using any force. That's theoretically the distinction between pacifists and libertarians. In practice politically, very little difference. As for the boys in Wyoming, I don't think they're much better. Maybe worse. Sure, if you show up with an actual gun and shoot at them, they're likely to shoot back. But for all the huffy talk about how government initiates force against them, what do they actually do to retaliate? At least City Beta Boy Snowden actually did something. If all their "eternal vigilance" amounts to is bitching and moaning when their liberty is infringed, what good are they?
Remember the context -- we're talking about persuasion in the social setting, about meme and value propagation, basically. In this context "pacifism" means "tolerance" in the sense of "you don't believe the same things as I do and that's fine".
And my point was that not only in that context, but in other contexts as well, rationalist libertarians are pacifists until the bullets flying at them are actual bullets.
I don't believe this to be true. At least according to my understanding of rationalist libertarians.

Scott just responded here, with a number of points relevant to the topic of rationalist communities.

I would assume there was supposed to be a link there?

Link here.
...and on the sidebar ("Recent on Rationality Blogs")....

Obviously, at some point being polite in our arguments is silly.

I think you seldom convince someone to change his opinion by name calling.

I once went to a talk about the implications of neurology for economics. Unfortunately for the professor who gave the talk, there was a badly dressed conspiracy theorist in his audience who was upset about the professor providing a new way to justify the economic status quo. That talk would have benefited from throwing out the conspiracy theorist instead of being nice to him. The reason isn't that the conspiracy theorist ... (read more)

You probably won't convince anyone, but you can probably discourage uncommitted/future people from taking the scorned position.
The problem with using rhetoric to push people off the fence is that it's pretty hard to tell which way they'll fall.

I think that a conception of heroic morality (basically, whether or not to use TDT, or choosing between act and rule utilitarianism) may be at the heart of many of the choices to be cooperative/nice or not. Many people seem to assume that they should always play the hero, and those more virtuous ones who don't seem to think that you would never be able to play the hero.

As an example, consider assassinating Hitler. It's not clear how Hitler could retaliate in kind -- he is already killing people who disagree with him, and he is a single tyrant while you are an invisible individual. This does not apply, however, if there are two roughly equal factions, say Fascists and Communists.

I don't understand all the consequentialist arguments against playing dirty. If your only objections are practical, then you're open to subtle dirty maneuvers that have very high payoffs.

A really simple example of this would be to ignore articulate opponents and spend most of your energy publicly destroying the opposition's overzealous lowest-common-denominators. This is actually how most of politics works...

... and also how this conversation seems to be working, since the Scott Alexander side seems more intent on arguing through hyperbole than addressing... (read more)

Addressing the most stupid of the opposition's arguments is not an enlightened way of conducting a discussion, but it's still way better than manufacturing and widely spreading false statistics. If the other side played equally dirty, we would see articles like: "Did you know that 95% of violent crimes are committed by Social Justice Warriors?" or "A woman is most likely to get raped at a feminist meeting (therefore, ladies, you should avoid those meetings, and preferably try to ban them on your campus)". [EDIT: After some thought, removed a realistic example of a specific form of attack against a specific person, because that kind of thing should not appear in LW discussions. Just leaving a hint: imagine how successful support for a false statistic could be used to design an ironic revenge on the very person who supported it.] I hope this sufficiently illustrates that the belief that the other side is already fighting as dirty as they can, and that you cannot give them ideas by fighting dirty yourself, is completely false.
You seem to be confused. Both of the things you mentioned are examples of "playing dirty". But this is a very stupid way to play dirty because it is transparent and can backfire. Making a public example of the other side's inarticulate idiots is extremely unlikely to backfire. Just a hint: If you are using consequentialist arguments against playing dirty, then you are open to playing dirty if you can be shown it works. I submit to you that you have a failure of imagination. Strategic mimicry is not one of my arguments. You seem to be arguing with someone else. Regardless, see the "consequentialist" point above.
Simple examples of playing dirty:

* Someone links a URL but it is broken in an obvious way. If you were truly interested in arguing for the sake of argument, you could fix the URL and follow their link. But you could also take the opportunity to complain that they are just wasting your time and aren't really serious.
* Sometimes, there is a finite amount of time or space for your opponents to reply in. You can pick arguments whose articulation is economical but whose rebuttal is not. This puts a huge volumetric burden on them, such that they will be unlikely to be able to reply to all your points. Later you can point out that they "ignored many of your best arguments". This is an old debater's trick.
* You're going to have a live debate online for a public audience. 45 minutes beforehand, you receive an e-mail from your opponent indicating that they are having difficulty connecting to Skype and suggesting the debate be moved to Omegle. You can play nice and get the debate to happen, or you can pretend that you didn't see the e-mail in time and then gloat that your opponent didn't show up because of "technical difficulties", har har har.
* Abuse the last word. If you're in the final stretch of a debate, bring up new issues that your opponent cannot address because they are out of time. This technique is actually heavily penalized in high school debate competitions, but people get away with it regularly because adults are more biased than teenagers.

Just think about how much more persuasive fighting dirty sounds if the whole fate of the human race hangs in the balance. As is, there is an underlying assumption that we have infinite time to grind down our opposition with passive logical superiority.

If the fate of the whole human race hangs in the balance, then it is particularly important that the correct decision is taken, not just the one most driven by tribal feeling, loose rhetoric, etc. Therefore it is particularly important that we are able to evaluate all ideas as accurately as we can, and particularly important not to spread lies, etc. Of course, if you assume going in that your ideas are infallible, then fighting dirty can look appealing. But if the fate of the human race hangs in the balance, then you can afford the luxury of that assumption.
Okay, so, we don't know what the right answer is. But we know what the right answer ISN'T, right? We know that the Westboro Baptist Church isn't going to lead the human race into a new golden age. Why not try to limit their influence? And even if there were some seemingly bad ideas that could, through some twist, actually be good ideas, there are still nonzero costs to considering them. If there is a 0.00001% chance that an idea is "the answer", but a 99.99999% chance of wasting everyone's time and making some people angry, we should probably discard it. Why waste time when we can pursue the handful of ideas that have a much higher chance of improving the world? I'm going to assume you meant that you can't afford the luxury of that assumption, and actually, yes I can. In fact, I have no choice. I have a finite amount of computational power, and if I go through all possible permutations of ideas, then the probability of me coming out with The Right Answer becomes vanishingly small. Instead, I can apply some very defensible heuristics to write off huge sections of thought wholesale. I should focus my efforts on ideas that are not obviously wrong.
See Yvain's post on Schelling Fences on Slippery Slopes. You do realize that most people have the same opinion about the Singularity?
This is not a blanket reason to defend all ideologies against censorship. The analysis of many religions also implicitly assumes that there is no cost to tolerating competing religions, whereas there is a definite cost to hearing out many of the worst political ideologies. It's almost as if the slippery slope works both ways. If you can't filter anything, your energy is drained by a thousand paper cuts. I wasn't aware that the general public was angry about Singularity nerds. I was talking more about like teenage neo-nazis. Extremely high probability to contribute nothing, piss a bunch of people off, and waste all our time.
In the case of the Singularity, I'd say that most people don't consider the probability and very large payoffs.

"How dire [do] the real world consequences have to be before it's worthwhile debating dishonestly"?

~~My brief answer is:~~

One lower bound is:

If the amount that rationality affects humanity and the universe is decreasing over the long term. (Note that if humanity is destroyed, the amount that rationality affects the universe probably decreases).

~~This is also my answer to the question "what is winning for the rationalist community"?~~

~~Rat~~... (read more)

Downvoted for the fake utility function. "I won't let the world be destroyed because then rationality can't influence the future" is an attempt to avoid weighing your love of rationality against anything else. Think about it. Is it really that rationality isn't in control any more that bugs you, and not everyone dying, or the astronomical number of worthwhile lives that will never be lived? If humanity dies to a paperclip maximizer, which goes on to spread copies of itself through the universe to oversee paperclip production, each of those copies being rational beyond what any human can achieve, is that okay with you?
Ilverin the Stupid and Offensive
Thank you, I initially wrote my function with the idea of making it one (of many) "lower bound"(s) of how bad things could possibly get before debating dishonestly becomes necessary. Later, I mistakenly thought that "this works fine as a general theory, not just a lower bound". Thank you for helping me think more clearly.