For the last several years, I've known people who've submitted articles to the EA Forum or LessWrong and found that the culture on these sites is pretty hostile to the kinds of views they're presenting (different people have different opinions about which views meet hostility on the EA Forum and LW, respectively). What the particular views are doesn't matter, because it's been all kinds of views. What's common between them is a perception that an article they wrote, and believe to be quite good, was "poorly received" on the site, despite having a positive and non-trivial number of upvotes. None of the articles I'm thinking of had what I would call a high number of upvotes, but it was enough that at least several people had read the article, and a majority of those who voted had upvoted it. As a proportion of the people who voted on the article, that means a significant minority, typically between 30% and 45%, disagreed with or disliked it enough to downvote it.
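
To make that arithmetic concrete, here is a toy example with made-up numbers. It assumes one karma point per vote, which these sites' weighted voting complicates, so treat it only as a rough illustration:

```python
# Toy illustration: a modestly positive karma score can still mean a large
# minority of the people who voted were downvoters.
# Assumes one point per vote; real LW/EA Forum votes carry variable weight.
net_karma = 4        # displayed score (hypothetical)
total_voters = 14    # number of people who voted (hypothetical)

upvotes = (total_voters + net_karma) // 2    # 9
downvotes = total_voters - upvotes           # 5

print(f"{downvotes / total_voters:.0%} of voters downvoted")  # -> 36%
```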

So, these are articles that were not especially well-received. Still, unless I'm missing something, an article having a positive karma score should be interpreted as the article being at least somewhat well-received by its readers. If someone thinks an article on the EA Forum or LW has received too many downvotes, or not enough upvotes, because there is something wrong with the general culture of the respective membership bases of these sites, that is one argument. But it is a different argument from the one I typically see made by people who complain about the ratio of upvotes to downvotes they receive on the EA Forum or LessWrong. They say, based on not receiving the strong, consistent, vocal support for their articles that they would have liked to see, that the reception to their article was overwhelmingly and uniformly negative. This is despite the fact that they received more positive than negative feedback, if only as measured by karma. In other words, it's my observation that a lot of people who complain as if they've gotten uniformly, overwhelmingly, and inappropriately negative feedback on their articles have a false impression of how their articles were actually received.

The common pattern I see in articles like this is a high proportion of comments that disagree with or criticize the OP, and these comments often receive more upvotes than the OP itself. So it seems that what people are really upset about, when their articles on the EA Forum or LW receive merely a lukewarm reception rather than an overwhelmingly negative one, is that while a majority of the community at least weakly supports them, a significant minority is more willing to strongly and vocally disagree with, criticize, or oppose them.

It seems to me one solution, to move toward a better equilibrium of discussion, would be for users who agree with an original article, but can make the arguments for its thesis better than the original author, to write their own comments and articles doing so. There is a fear of politics among some rationalists, which stigmatizes discussions that might appear to heighten tensions between different groups within effective altruism and/or the rationality community. It's also my impression that in these communities there is a culture of letting an individual's words and ideas stand on their own, and of not attributing the arguments of one individual on one side of a debate to everyone on that side. So it strikes me as unlikely that the vast majority of effective altruists or LessWrongers would care enough to join an interpersonal/internecine online disagreement to the point that the community at large would need to concern itself with quelling it.

Of course, one of the reasons LW and the EA Forum use karma scores as they do is so that the discourse on these fora can be shaped in a way satisfying to most users, without everyone getting bogged down in endless debates about the state of the discourse itself. At least, that is what I've taken the karma systems of LW and the EA Forum to largely be about, and why I have what I believe is an appropriate level of respect for them. They're certainly not everything one should take into account when evaluating the quality of an article on LW or the EA Forum. Yet I don't think they should count for little, or nothing, either, which is sometimes the reaction I see from EA Forum or LW members who aren't satisfied with the proportion and kind of positive feedback their article receives, especially relative to the proportion and kind of negative feedback.

These are my impressions as a long-time user of both the EA Forum and LessWrong. I was just wondering what other people's thoughts on the subject are; whether they're similar to, or different than, mine; and why.



2 Answers

For myself, I find there is something really disheartening about presenting ideas in a post, seeing it downvoted, and then learning in the comments that it was downvoted because a person disagreed with you: they didn't like your conclusions, or what they thought you wrote, or what they feared was implied by what you wrote, rather than because you did something that worked against the broader conversations we are trying to have around here. It's very hard to think things through for yourself and say things that might be wrong, and when voting makes people feel punished for doing that, I think it creates incentives that reduce exploration of ideas in favor of refinement of existing ideas (basically a move toward scholasticism).

As you note, it's especially frustrating when a comment on your post gets more votes than your post did, and that comment is a poorly reasoned objection to your post or a refutation of a strawman version of what you said. For me this often happens when I attempt to lay out some complex, nuanced idea that was going to be difficult to explain no matter how I did it, and then someone objects to what I perceive to be a simplified, rough version of it. And when that objection gets a lot of upvotes, it feels like a strong signal that all attempts to say things that aren't simple extensions of what people already believe will be rejected (I especially feel this way because my recollection is that the worst of these cases usually involve someone hitting rationalist applause lights and then getting a lot of "applause" for doing so without actually saying much of anything).

None of this is to deny that I could be a better writer or have better ideas, or to say that I don't want critical engagement with my writing; it's only to say that it stings when you see voting patterns that feel more like boos and yays than like some recognition of what is most worth engaging with on the site. I would be pretty happy if someone commented on my posts raising well-thought-out objections or asking clarifying questions that led me to realize I was mistaken, and that got a lot of upvotes, rather than something getting a lot of upvotes that feels like it's just scoring points against me and doesn't really try to engage me or my ideas. I suspect that it's only thanks to my now strong psychological resilience that I keep on posting on LW, and I worry about who else is being silenced because they don't want to subject themselves to the harsh judgement of the crowd.


If you never get downvoted, you're not being contrarian enough.

I sort of agree, but this holds the voting system fixed and assumes you're making tradeoffs along the efficiency frontier. There are probably ways to pull the voting system sideways so that you can optimize for more of what you care about, which is currently only crudely captured by the notion of "contrarianism" that exists as a result of compressing ourselves down into a simple, generalized up/down vote system.

This is still good advice, though, with respect to the current system.

John_Maxwell:
Yeah. Upvotes/downvotes act as reward/punishment respectively. So the problem with voting to express agreement/disagreement is that you are rewarding people for expressing common views and punishing them for expressing uncommon views. Which can lead to an echo chamber. But it's still valuable to know whether people agree or disagree! So I suspect the ideal voting system would separate out the "more of this"/"less of this" axis from the "agree"/"disagree" axis. You could have people fill out text boxes anonymously to explain their "more of this"/"less of this" votes, then do text clustering once you had enough filled-out text boxes, then figure out the top 10 reasons people choose "more of this"/"less of this" and replace the text boxes with dropdowns. To guard against misuse, you could weight dropdown selections from users who tend to agree with trusted moderators more heavily.
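
For what it's worth, here is a minimal sketch of what such a two-axis vote plus clustered free-text feedback might look like. Everything here is illustrative: the field names are made up, neither LW nor the EA Forum works this way, and the clustering method (TF-IDF plus k-means) is just one arbitrary choice:

```python
# Sketch of a two-axis vote record plus clustering of free-text explanations.
# Purely illustrative: not how LW/EA Forum voting actually works.
from dataclasses import dataclass
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

@dataclass
class Vote:
    quality: int      # "more of this" (+1) / "less of this" (-1)
    agreement: int    # "agree" (+1) / "disagree" (-1), independent of quality
    reason: str       # optional anonymous explanation of the quality vote

votes = [
    Vote(-1, -1, "conclusion seems unsupported by the cited sources"),
    Vote(+1, -1, "disagree with the thesis but the argument is laid out clearly"),
    Vote(-1, +1, "agree, but this rehashes earlier posts without adding much"),
    Vote(-1, -1, "the cited sources don't support the conclusion drawn"),
]

# Cluster the free-text reasons to surface the most common kinds of feedback.
texts = [v.reason for v in votes if v.reason]
matrix = TfidfVectorizer(stop_words="english").fit_transform(texts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(matrix)
for text, label in zip(texts, labels):
    print(label, text)
```

Once enough explanations accumulated, the most common clusters could become the dropdown options described above, and selections could be weighted by how often a user's votes track those of trusted moderators.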

For me, accumulated karma is mostly an indicator of how long someone's been here and how much they've participated. Common use seems to be mostly upvote; downvotes aren't rare, but a pretty neutral comment is likely to get 2-10 karma, and only a pretty bad one gets into the negative range. And posters who routinely get downvoted (for whatever reasons) likely either change or leave, so there's a strong selection toward an expectation of more upvotes than downvotes.

I find the karma changes on a comment I make somewhat useful - mostly they indicate how popular the post I'm commenting on is, but secondarily they give me a sense of whether I'm commenting on the points that most readers find salient in the post.

I'll admit that votes carry more emotional weight than I want them to - I know they're meaningless internet points, and a rather noisy signal of popularity, but it still feels nice when something gets more upvotes than normal, and hurts a bit when I'm downvoted.

You make a good point I forgot to add: the function karma on an article or comment serves in providing information to other users, as opposed to just the submitter. That's something people should keep in mind.

6 comments

Of course, one of the reasons LW and the EA Forum use karma scores as they do is so that the discourse on these fora can be shaped in a way satisfying to most users, without everyone getting bogged down in endless debates about the state of the discourse itself.

I think this topic was discussed a lot in the early days of LW (around 10 years ago, I guess!) but I haven't seen much discussion recently. So it's probably worth discussing again, at least a little bit. Here are some exploratory ramblings...

I think it's human nature to be much more sensitive to negative feedback than positive feedback. But it's probably better to consider the aggregate scores your posts are getting when deciding whether you are adding value or not. In other words, I suspect if your contributions receive more upvotes than downvotes by a comfortable margin, you should interpret that as: the aggregate view of the people reading the forum is that your contributions are valuable on net.

I also think social rewards are generally a little bit sparse in any sort of online discussion context, and you should adjust for that accordingly. Like, I'm sure I've had conversations with EAs where we talk about nothing at all for hours, and I don't feel bad about wasting their time. And I don't think I should feel bad, because it is valuable for people to connect with each other, it strengthens the EA movement, etc. And furthermore, when I talk to people in person about conversations I've had online, I typically find the in-person discussion far more rewarding than the online conversation.

I'm not totally sure what I'm trying to get at here with this, but I guess one way of looking at it is that the voting system could be seen as capturing a purely informational dimension of your comment's value, which is useful for sorting/filtering/attention conservation/etc. But just because it is voted down (meaning in the judgement of others, people should not be reading it or considering it reliable) doesn't necessarily mean it was harmful for you to write. It could have been helpful if only to clarify your own understanding, or state an incorrect position in a way someone else could clearly dismantle, or even just signal based on the downvotes that many people disagree with the view you expressed. Like, the entire point of the voting system is to decrease the amount of harm that low-value contributions cause! The exception would be if your comment was itself destroying social capital, e.g. by being nasty for no reason. In that case you are bad and you should feel bad.

I guess one reason a contribution could be harmful is if it makes a bad argument for a position which is incorrect, and if this bad argument is left un-refuted, many people will read it and adjust their behavior based on it? It would appear that downvoting mostly mitigates the harms in this case because people are going to be less willing to adjust their behavior based on a downvoted comment? It could also waste the time of people who feel the need to refute your bad argument? But if the bad argument gets upvoted then it's plausibly worthwhile for someone to write a refutation anyway since apparently people are agreeing with it.

BTW, it might be worth separating out the case where controversial topics are being discussed vs boring everyday stuff. If you say something on a controversial topic, you are likely to get downvotes regardless of your position. "strong, consistent, vocal support" for a position which is controversial in society at large typically only happens if the forum has become an echo chamber, in my observation.


On a society-wide scale, "boring everyday stuff" is uncontroversial by definition. Conversely, articles that have a high total number of votes but a close-to-even upvote:downvote ratio are by definition controversial to at least several people. If wrong-headed takes on boring everyday stuff aren't heavily downvoted, and are "controversial" to the point that half or more of the voters supported someone spreading supposedly universally recognizable nonsense, that's a serious problem.

Also, regarding the EA Forum and LW at least, "controversial topics" vs. "boring everyday stuff" is a false dichotomy. These are fora for all kinds of "weird" stuff by societal standards. Some popular positions on the EA Forum and LW are also controversial by those standards, but that's normal for EA and LW. What societal standards don't reflect is which positions are or aren't controversial on the EA Forum or LW, and why. There are heated disagreements within EA, or on LW, where most people outside those fora don't care about any side of the debate. For the examples I have in mind, some of the articles were on topics controversial in society at large, and some were controversial only in the more limited sense of being disputed on the EA Forum or LW.

This is presented in such a way that I'm suspicious there's something deeper than the question (which I tried to answer in a straightforward way) that's bugging you about comment threads for a specific type or topic (or perhaps author or style) of post.

You imply that people are upset or think something's wrong, but I don't actually know if you agree or not, nor what information you're really looking for with this question. This may be one of those (surprisingly common, and anathema to typical rationalist-wannabe nerds like myself) cases where you really can't start with general solutions to specific problems, even if it looks like the problems are very similar and should have a shared cause and repeatable solution. You have to address an actual real instance of the problem half a dozen to a few thousand times before those useful generalizations can be made.


What bugs me is when people who ostensibly aspire to understand reality better let their sensitivity get in the way, and let their feelings colour their picture of how their ideas are actually being received. It seems to me this should be a basic debiasing skill that people would employ if they were as serious about being effective or rational thinkers as they claim to be. If there is anything bugging me that you're suspicious of, it's that.

Typically, I agree with an OP who is upset about the low quality of negative comments, but I disagree with how upset they get about it. The things they say as a result are often inaccurate. For example, because of a few comments' worth of low-quality negative feedback on a post that's otherwise decently upvoted, people will say that a negative reception is typical of LW or the EA Forum. They may not be satisfied with the reception an article of theirs has received, but that's a different claim from saying the reception was extremely negative.

I don't agree with how upset people are getting, though I do think they're typically correct that the quality of some responses to their posts is disappointingly low. I wasn't looking for a solution to a problem. I was asking an open-ended question to seek answers that would explain some behaviour on others' part that doesn't fully make sense to me. Some other answers I've gotten are just people speaking from their own experience, like G Gordon's, and that's fine by me too.

What's common between them is a perception that an article they wrote, and believe to be quite good, was "poorly received" on the site, despite having a positive and non-trivial number of upvotes.

So reception intuitively seems like it involves voting.

The common pattern I see in articles like this is a high proportion of comments that disagree with or criticize the OP, and these comments often receive more upvotes than the OP itself. So it seems that what people are really upset about, when their articles on the EA Forum or LW receive merely a lukewarm reception rather than an overwhelmingly negative one, is that while a majority of the community at least weakly supports them, a significant minority is more willing to strongly and vocally disagree with, criticize, or oppose them.

Other factors: "comments".

So there are two "upsetting"/"uplifting" dimensions: votes, and comments.

If we assume they're independent, then there are four possibilities:

1) Upvoted, positive comments.

2) Upvoted, negative comments.

3) Downvoted, positive comments.

4) Downvoted, negative comments.


1. looks like a warm reception.

2. seems to be the problem.

3. might be on par with 2, or maybe it's not as bad. It depends on whether people care about votes or comments more, or if they're unhappy if either dimension is negative.

4. probably makes people feel bad, but it seems rare - downvoted with no or few comments seems more common. (Although accounting for that brings the number of quadrants up to 9.)


Note also that displayed totals are misleading - this is definitely not one vote per reader. A vote can be worth anywhere from 1 to over 10 points, depending on the karma total of the voter and whether it's a "strong" vote. For totals below 30 or so, it's mostly noise rather than signal - this is 6-8 votes out of possibly hundreds of readers.
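
To illustrate why, here is a toy comparison with invented vote weights (the actual mapping from voter karma and strong votes to vote strength isn't reproduced here): two very different voting patterns can display the same total.

```python
# Two hypothetical voting patterns that display the same total, illustrating
# why a score under ~30 says little about how many readers actually voted.
# Weights are invented for the example; real LW vote weights depend on the
# voter's karma and on whether the vote is a "strong" vote.
pattern_a = [+8, +8, +5, +1, -2]       # 5 voters, a couple of strong upvotes
pattern_b = [+1] * 24 + [-1] * 4       # 28 voters, all ordinary votes

assert sum(pattern_a) == sum(pattern_b) == 20
print(sum(pattern_a), "karma from", len(pattern_a), "voters")
print(sum(pattern_b), "karma from", len(pattern_b), "voters")
```

A displayed "20" could be a handful of enthusiastic voters or dozens of mildly positive ones, which is why small totals are better read as noise.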