Cross Posted at the EA Forum

At Event Horizon (a Rationalist/Effective Altruist house in Berkeley), my roommates were worried yesterday about Slate Star Codex. Their worries also apply to the Effective Altruism Forum, so I'll extend them.

The Problem:

Lesswrong was for many years the gravitational center for young rationalists worldwide, and it permits posting by new users, so newcomers with good ideas had a strong incentive to post them there.

With the rise of Slate Star Codex, the incentive for new users to post content on Lesswrong went down. Posting at Slate Star Codex is not open, so potentially great bloggers are not incentivized to develop their own ideas there, but only to comment on Scott's.

The Effective Altruism forum doesn't have that particular problem. It is however more constrained in terms of what can be posted there. It is after all supposed to be about Effective Altruism. 

We thus have three different strong attractors for the large community of people who enjoy reading blog posts online and are nearby in idea space. 

Possible Solutions: 

(EDIT: By possible solutions I merely mean to say "these are some bad solutions I came up with in 5 minutes, and the reason I'm posting them here is that if I post bad solutions, other people will be incentivized to post better solutions.")

If Slate Star Codex became an open blog like Lesswrong, more people would consider transitioning from passive lurkers to actual posters. 

If the Effective Altruism Forum got as many readers as Lesswrong, there could be two gravity centers at the same time. 

If the moderation and self-selection of Main were changed into something that attracts those who have been on LW for a long time, and Discussion were changed into something like a newcomers' discussion, LW could go back to being the main space, with a two-tier system (maybe one modulated by karma as well).

The Past:

In the past there was Overcoming Bias, and Lesswrong in part became a stronger attractor because it was more open. Eventually Lesswrongers migrated from Main to Discussion, and from there to Slate Star Codex, the 80k blog, the Effective Altruism forum, back to Overcoming Bias, and Wait But Why.

It is possible that Lesswrong had simply exhausted its capacity.

It is possible that a new higher tier league was needed to keep post quality high.

A Suggestion: 

I suggest two things should be preserved:

Interesting content being created by those with more experience and knowledge who have interacted in this memespace for longer (part of why Slate Star Codex is powerful), and 

The opportunity (and total absence of trivial inconveniences) for new people to try creating their own new posts. 

If these two properties are kept, there is a lot of value to be gained by everyone. 

The Status Quo: 

I feel like we are living in a very suboptimal blogosphere. On LW, Discussion is more read than Main, which means what is being promoted to Main is not attractive to the people who are actually reading Lesswrong. The top tier of widely read posting is dominated by one individual (a great one, but still), disincentivizing high-quality posts by other high-quality people. The EA Forum has high-quality posts that go unread because it isn't the center of attention.


153 comments

I've previously talked about how I think Less Wrong's culture seems to be on a gradual trajectory towards posting less stuff and posting it in less visible places. For example, six years ago a post like this qualified as a featured post in Main. Nowadays it's the sort of thing that would go in an Open Thread. Vaniver's recent discussion post is the kind of thing that would have been a featured Main post in 2010.

Less Wrong is one of the few forums on the internet that actually discourages posting content. This is a feature of the culture that manifests in several ways:

  • One of the first posts on the site explained why it's important to downvote people. The post repeatedly references experiences with Usenet to provide support for this. But I think the internet has evolved a lot since Usenet. Subtle site mechanics have the potential to affect the culture of your community a lot. (I don't think it's a coincidence that Tumblr and 4chan have significantly different site mechanics and also significantly different cultures and even significantly different politics. Tumblr's "replies go to the writer's followers" mechanic leads to a concern with social desirability that 4

... (read more)

Within an hour, I have thought of so many potential criticisms or reasons that my post might come across as lame that I am totally demoralized. I save my post as a draft, close the tab, and never return to it.

It doesn't help that even the most offhand posting is generally treated as if it were an academic paper, and reviewed and skewered accordingly :-p.

It doesn't help that even the most offhand posting is generally treated as if it were an academic paper, and reviewed and skewered accordingly :-p.

I agree. There are definitely times for unfiltered criticism, but most people require a feeling of security to be their most creative.

I believe this is referred to as "psychological safety" in the brainstorming literature, for whatever that's worth.
Agreed. This is, for me, one of the main advantages of posting on tumblr. You still get the feedback you want from clever people and criticism, but that criticism doesn't feel quite as bad as it would here, because everyone realizes that tumblr is a good space to test and try out ideas. Less Wrong feels, to me, more like a place where you share more solidified ideas (with the Open Thread as a possible exception).
I will point out that I didn't put that in Main (which is where I target the majority of the post-style content I create) because I think the first paragraph is the only 'interesting' part of that post, and it's a fairly straightforward idea, and the primary example was already written about by Eliezer, twice. This is a more serious issue, which was actually pretty crippling with the aforementioned discussion post--but that was mostly because it was a post telling people "you can't tell people things they don't know." (Yes, there's the consolation that you can explain things to people, but did I really want to put in the effort to explain that?)
Is anyone in favor of creating a new upvote-only section of LW? [pollid:988]

Proposals for making LW upvote-only emerge every few months, most recently during the retributive downvoting fiasco. I said then, and I continue to believe now, that it's a terrible idea.

JMIV is right to say in the ancestor that subtle features of moderation mechanics have outsized effects on community culture; I even agree with him that Eliezer voiced an unrealistically rosy view of the downvote in "Well-Kept Gardens". But upvote-only systems have their own pitfalls, and quite severe ones. The reasons behind them are somewhat complex, but boil down to bad incentives.

Imagine posting as a game scored in utility. Upvotes gain you utility; downvotes lose you it; and for most people being downvoted costs you more than being upvoted gains you, though the exact ratio varies from person to person. You want to maximize your utility, and you have a finite amount of time to spend on it. If you spend that time researching new content to post, your output is low but it's very rarely downvoted. Debate takes a moderate amount of time; votes on debate are less reliable, especially if you're arguing for something like neoreaction or radical feminism or your own crackpot views on t... (read more)

He isn't suggesting making LW upvote-only, just creating a new section of it that is upvote-only. And why not? If you're right, the evidence will bear out that it is a terrible system. But we won't know until we test the idea.
An earlier version of my comment read "LW or parts of it". Edited it out for stylistic reasons and because I assumed the application to smaller domains would be clear enough in context. Guess I was wrong. Granted, not everything I said would apply to the first proposal, the one where top-level posts are upvote-only but comments aren't. That's a little more interesting; I'm still leery of it but I haven't fully worked out the incentives. As to empirics, one thing we're not short on is empirical data from other forums. We're not so exceptional that the lessons learned from them can't be expected to apply.
Apologies if that seemed like a nitpick (which I try to avoid). I thought it was relevant because even if you are right, trying out the new system wouldn't mean making LessWrong terrible, it would just mean making a small part of LessWrong terrible (which we could then get rid of). The cost is so small that I don't see why it shouldn't be tried.
I think the cost is higher than you're giving it credit for. Securing dev time to implement changes around here is incredibly hard, at least if you aren't named Eliezer, and changes anywhere are usually harder to back out than they are to put in; we can safely assume that any change we manage to push through will last for months, and forever is probably more likely.
Hacker News has a downvote, but you need 500 karma to use it. This keeps it from being used too often, and ensures it's used only by people very familiar with the community culture. Stack Overflow allows anyone to downvote, but you have to spend your own karma, to discourage it. HN also hides the votes that comments have, and reddit has been moving to this policy as well.
That's exactly my problem with reddit-style voting in general. Human communication, even in an impoverished medium such as forum posting, is highly, highly complex and pluridimensional. Plus one and minus one don't even begin to cover it, even when the purpose is a quick and informal moderation system. Good post on a wholly uninteresting topic? Good ideas once you get past the horrendous spelling? One-line answers? Interesting but highly uncertain info? Excessive posting volume? The complete lack of an answer where one would have been warranted? Strong (dis)approval looking just like mild (dis)approval? Sometimes it's difficult to vote.

Besides, the way it is set up, the system implicitly tells people that everyone's opinion is valid, and equally valid at that. Good for those who desire democracy in everything, but socially and psychologically not accurate. Some lurker's downvote can very well cancel out EY's upvote, for instance, and you'll never know. Maybe some sort of weighted karma system would work better, wherein votes would count more according to a combination of the voter's absolute karma and positive karma percentage.

To address your specific concerns about upvote-only systems: positive feedback expressed verbally may be boring to read and to write, hence reducing it to a number, but negative feedback expressed silently through downvotes leaves you wondering what the hell is wrong with your post and according to whom. As long as people can still reply to each other, posters of cat pictures can still be disapproved of, even without downvotes. And perhaps the criticism may stick more if there are words to "haunt" you rather than an abstract minus one. However, this one strongly depends on community norms. If the default is approval, then the upvote is the cheap signal and a downvote-only system can in fact work better. If the default is disapproval, then the downvote is a cheap signal. An upvote-only policy works
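For what it's worth, the weighted-karma idea could be sketched like this. This is a toy model, not anything LW implements: the log-scaling, the inputs, and the function names are all my own assumptions.

```python
import math

def vote_weight(voter_karma, positive_ratio):
    """Weight a vote by the voter's absolute karma and positive-karma ratio.

    Log-scaled so a very-high-karma account still counts more than a lurker,
    but doesn't drown everyone else out entirely.
    """
    return (1 + math.log10(max(voter_karma, 1))) * positive_ratio

def post_score(votes):
    """Score a post from (direction, voter_karma, positive_ratio) tuples,
    where direction is +1 for an upvote and -1 for a downvote."""
    return sum(d * vote_weight(k, r) for d, k, r in votes)
```

Under this scheme a 1000-karma voter with a 90% positive ratio contributes a weight of 3.6, so a single fresh-account downvote (weight 0.5) no longer cancels an established member's upvote.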
Other. I do not think there is a need for a new section. Instead, we could encourage people to use tags (e.g. something like these belief tags) and put disclaimers at the top of their posts. Even though actual tags aren't very easy to notice, we can use "informal tags", such as, e.g., putting a tag in square brackets. For example, if you want to post your unpolished idea, your post could be titled something like this: "A Statement of idea [Epistemic state: speculation] [Topic: Something]" or "A Statement of idea [Epistemic state: possible] [Topic: Something]" or "A Statement of idea [Epistemic state: a very rough draft] [Topic: Something]". In addition to that you could put a disclaimer at the top of your post.

Perhaps such clarity would make it somewhat easier to be somewhat more lenient on unpolished ideas, because even if a reader can see that the poster intended their post to be a rough draft with many flaws, they cannot be sure that the draft being highly upvoted won't be taken by another reader as a sign that the post is correct and flawless (or at least thought as such by a lot of LWers), thus sending the wrong message. If a poster made it clear that they merely explore a curious idea, an interesting untested model, or something that has only a remote possibility of not being not even wrong, a reader would be able to upvote or downvote a post based on what the post was trying to achieve, since there would be less need to signal other readers that a post has serious flaws, and therefore should not be believed, if it was already tagged as "unlikely" or something like that.

Perhaps numerical values to indicate the belief status (e.g. [0.3]) could be used instead of words. There would still be an incentive to tag your posts as "certain" or "highly likely", because most likely they would be treated as having more credibility and thus attract more readers.
Another approach would be not allowing downvotes to be open to all users. On the Stack Exchange network, for example, you need a certain amount of reputation to downvote someone. I'd bet that a very large majority of the discouraging/unnecessary/harmful downvotes come from users who don't have above, say, 5-15 karma in the last month. Perhaps official downvote policies messaged to a user the first time they pass that threshold would help too. This way involved users can still downvote bad posts, and the bulk of the problem is solved. But it requires technical work, which may be an issue.
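Combining this threshold idea with the spend-karma-to-downvote mechanic mentioned in the Hacker News / Stack Exchange comment above, the gating logic would be quite small. This is a rough sketch under assumed data shapes; the field names and the numeric cutoffs are hypothetical, not LW's actual schema.

```python
DOWNVOTE_KARMA_THRESHOLD = 10  # hypothetical cutoff, cf. the 5-15 range suggested above
DOWNVOTE_COST = 1              # Stack Exchange-style: spend your own karma to downvote

def try_downvote(voter):
    """Attempt a downvote for `voter`, a dict with 'karma_last_month'
    and 'karma' keys (assumed schema). Returns True if allowed and applied."""
    if voter["karma_last_month"] < DOWNVOTE_KARMA_THRESHOLD:
        return False  # not enough recent standing in the community
    if voter["karma"] < DOWNVOTE_COST:
        return False  # must have karma to spend on the downvote
    voter["karma"] -= DOWNVOTE_COST
    return True
```

The point of the sketch is only that both proposed mechanics reduce to a couple of guard clauses, so the "technical work" here is mostly plumbing, not algorithm design.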
Anything with messages could be implemented by a bot account, right? That could be made without having to change the Less Wrong code itself. Maybe we could send a message to users with guidelines on downvoting every time they downvote something? This would gently discourage heavy and/or poorly reasoned downvoting, likely without doing too much damage to the kind of downvoting we want. One issue with this is it would likely be very difficult or practically impossible for a bot account to know when someone downvotes something without changing the LW code. (Though it probably wouldn't require a very big change, and things could be limited to just the bot account(s).) [pollid:989]
Every time someone downvotes would probably be too much, but maybe the first time, or if we restrict downvotes only for users with some amount of karma then when they hit that level of karma?
Would you be willing to run a survey on Discussion also about Main being based on upvotes instead of a mix of self-selection and moderation? As well as all ideas that seem interesting to you that people suggest here? There could be a research section, an Upvoted section, and a discussion section, where the research section is also displayed within the upvoted, trending one.
On second thought, I'll risk it. (I might post a comment to it with a compilation of my ideas and my favorites of others' ideas, but it might take me a while.)
I'd rather not expose myself to the potential downvotes of a full Discussion post, and I also don't know how to put polls in full posts, only in comments. Nonetheless I am pretty pro-poll in general and I'll try to include more of them with my ideas.
Another suggestion. Every downvote costs a point of your own karma. You must have positive karma to downvote.
I don't know if I've ever read the following from an original source (i.e., Eliezer or Scott), but when people ask "why do those guys no longer post on Less Wrong?", the common response I get from their personal friends in the Bay Area, or wherever, and from the community at large is, however justified or not: the worry that their posts would be overly criticized is what drove them off Less Wrong for fairer pastures, where their ideas wouldn't need to pass through a crucible of (possibly motivated) skepticism before being valued or spread.
Which shows that a bug to some people is a feature to others. A lot of posts, including in the Sequences, have really good criticisms in the comments. (For that matter, a lot of SSC posts have really good criticisms in the comments, which Scott usually just ignores.) I can easily understand why people don't like reading criticism, but if you're posting for the ideas, some criticism should be expected.
All true. The key point seems to be not to aim for Main if you have some creative idea. Most creative ideas fail. That doesn't mean they were bad ideas, just that creativity doesn't work like safe success. Main is for a specific audience and requires a specific class of writers. Why not aim for Discussion or an Open Thread? Yes, these are tiers, and maybe a smoother transition would be nicer, but as it is, that works fine.
This. My standard for what I would post on LW eventually just became too high - higher than what I would post on my own blog, and beyond justifiable effort.
This comment is great. Please cross-post the suggestions for effective altruism especially to the Effective Altruism Forum. If you don't, do you mind if I do?
Thanks! I already linked to my comment from the EA forum. If you want to signal-boost it further, maybe put a link to it and/or a summary of my suggestions in the EA Facebook group? By the way, I'm planning to write a longer post fleshing out the idea of peer-reviewed blog posts at some point.
I think he's only talking about Slate Star Codex.

I think this post misses a lot of the scope and timing of the Less Wrong diaspora. A lot of us are on Tumblr now; I've made a few blog posts at the much more open group blog Carcinisation, there's a presence on Twitter, and a lot of us just have made social friendships with enough other rationalists that the urge to post for strangers has a pressure release valve in the form of discussing whatever ideas with the contents of one's living room or one's Facebook friends.

The suggestions you list amount to "ask Scott to give up his private resource for a public good, even though if what he wanted to do was post on a group blog he still has a LW handle", "somehow by magic increase readership of the EA forum", and "restructure LW to entice the old guard back, even though past attempts have disintegrated into bikeshedding and a low level of technical assistance from the people behind the website's actual specs". These aren't really "solutions".

A lot of us are on Tumblr now; I've made a few blog posts at the much more open group blog Carcinisation, there's a presence on Twitter, and a lot of us just have made social friendships with enough other rationalists that the urge to post for strangers has a pressure release valve in the form of discussing whatever ideas with the contents of one's living room or one's Facebook friends.

I don't like this.

I do not have the time to engage in the social interactions required to even be aware of where all this posting elsewhere is going on, but I want to read it. I've been regularly reading OB/LW since before LW existed and this diaspora makes me feel left behind.

I started a thing back in March called the LessWrong Digest. First of all, to you and/or anyone else reading this who signed up for it, I'm sorry I've been neglecting it for so long. I ran it for a few weeks in March, but I was indisposed for most of April, and it's been fallow since then. It contains highlights from the blogs of rationalists who post off of Less Wrong. It doesn't contain Tumblrs yet. I'll restart it tonight. I intend to build upon it to have some sort of rationalist RSS feed. I don't know how many other rationalist Tumblrs or blogs it would include, but lots. Hopefully I can customize it.

Anyway, it's my goal to bring such projects to fruition so that every rationalist under the sun can be found, no matter how deep into the blogosphere they burrow.

This sounds like a great project! I approve of it. Let me know if I can help.
If you want, I can help with the tumblr part of this. If you don't need help with the tumblr part, but want to be pointed in the right direction, I host the Rationalist Masterlist with most of the tumblr rationalists on it. Also keep in mind that tumblr tends to have a very low signal-to-noise ratio.
There's a Masterlist for rational Tumblr, but I'm not aware of a complete list of all rationalist blogs across platforms. Perhaps the Less Wrong community might find it useful to start one? If it were hosted here on LW, it might also reinforce LW's position as a central hub of the rationality community, which is relevant to the OP.
I have already thought of doing this, and want to do it. I've been neglecting this goal, and I've got lots of other priorities on my plate right now, so I'm not likely to do it alone soon (i.e., by the end of June). If you want me to help you, I will. I may have an "ugh field" around starting this project. Suggestions for undoing any trivial inconveniences therein you perceive are welcomed.
Sorry for the late reply, and thanks for the offer! Unfortunately I wasn't actually talking about doing it myself, just putting it out there as an idea. Good luck though; it sounds like a valuable thing for the rationality community to have.
Curious what (in your own case, and your best estimation of other people's case) motivated the move to Tumblr?
I don't feel like I "moved to" Tumblr. I ran out of things that seemed like they'd be best expressed as LW posts and stopped being motivated by karma circa I think late 2010/early 2011 and my posting dropped off considerably. It was the end of 2012 when my sister convinced me to get a Tumblr, and I don't even mostly Tumbl about rationality (mostly). Scott has a Tumblr I think explicitly because he can dash off posts without worrying as much about quality, there; Mike has one for very similar social reasons to my own; I don't think most other people I can think of who are big on Rationalist Diaspora Tumblr were ever heavy posters on LW, although I could be missing some who don't have corresponding screen names, or forgetting someone. They're to a substantial extent different people who happened to enter the circle of interest when Tumblr was a reasonably effective way to hang out on a space with rationalists in it, and so they did that, because for whatever reason it was comfier.
I rarely bother to comment on this site, but this is important meta information. Many outsider groups, and rationalists in particular, seem to dissolve the moment their exclusion from standard social systems is removed.

The most dumbed-down example I have, and I specifically desire to post as lowbrow an example as possible, is the episode of Malcolm in the Middle titled "Morp." It's "prom" backwards, in case you missed that. The outsider group starts an anti-prom where they do everything ironically, and amusingly have all the same status bullshit problems over who is in charge or what should even be done as the normal kids' prom. Then when some random dumb popular girls come down, feel upper-class-girl pity, and invite them to real prom, everyone but Malcolm goes.

Less Wrong and its specific section of the rationalist community has approached this same singularity. It was all about getting enough like-minded and conveniently located people to form your own samesy, dull, cookie-cutter clique, just like normal people. Alicorn is a prime example of posts that expose this issue, although that whole cuddle-pile bullshit is a more general example. Much like, say, Atheism+, the OB/LW community has exploded into a million uncoordinated fragments merely seeking to satisfy their standard social needs. Meanwhile each of these shards has the same number of useless, weird, counterproductive group beliefs as mainstream Christians. And they've accomplished almost nothing except maybe funding the useless MIRI, if one even considers that an accomplishment. EA people even came and said MIRI doesn't qualify for GiveWell.

Indeed I feel my comparison to A+ is quite apt. So much bullshit spewed about improving stuff, raising the sanity waterline vs inclusive atheism, but each group did essentially the opposite of its goal. As per my title and associated duties I here mark the collapse of "internet rationalists" as a cohesive, viable, or at all productive group. Scott has a popular blog, Elie h

This is undiplomatically expressed but may contain an important seed of useful information for anyone who would like to recentralize rationalism: meeting people's normal, boring, apey social needs is important for retention, especially at scale when it seems more tempting to split off with your favorite small percentage of the group and not put in the effort with the rest. If you want people to post on Less Wrong, what's in it for them, anymore?

(I understand the desire to scare-quote the interestingness of my dinner parties but they are, in fact, parties at which dinner is served, in the most literal possible sense.)

Indeed. Especially if the point of LW is to socialize newcomers to rationality, well, socializing newcomers is hard and not particularly glamorous work, and we're (to some extent) selecting for people who don't want to be socialized!
That's clearly not true. Alicorn again is a perfect example of someone who clearly wanted to be socialized. I mean... dinner parties. Yes, I cannot get over the whole dinner party thing; get over it. More on point, though, centralization is the ultimate bugbear of the left/progressives/radicals/whatever. Look at the internecine wars of feminism or socialism or atheism. Furthermore, everyone wants to address their local personal issues first and also divides who is allowed to interfere in problems along demographic or identity lines. The success of a revolutionary movement, various religions being examples, requires both that it be more correct than what came before and that it be equally or more satisfying. One should be careful, though, of copying the old systems too closely. Ethical Humanist solstice parties? Good lord, what a terrible idea.
And notice how she's mostly absent on LW preferring instead to plan and arrange her dinner parties... :-P
If you say so, I barely come on here much. Today is the most active I've been in months.
I scare quoted dinner parties because they are the most ridiculously conventional upper middle class thing of all time. Even more than Valium.
Dinner parties are extraordinarily useful social tools. There's a -reason- upper middle class people do them. The causal relationship between "Being the sort of person to host dinner parties" and "Being upper middle class" doesn't flow in only one direction.
Yes, but it underlines what I was saying about "Morp." And it also addresses people who were asking why I singled out Alicorn. Whenever someone tells me I'm only doing something for attention, or that I only hate on certain things because I'm excluded, I say: "Thanks, Captain Obvious." It throws them off a lot. People who are different are different not by choice but by force. Conventional social norms exert a massive pressure on every individual, even ones with non-conforming parents/siblings/peers/teachers, and the only reason it doesn't work is that an equal or greater pressure is going the other way.

So many groups, including Less Wrong, are full of so much self-signalling, conscious or subconscious, and it destroys their ability to understand their own motivations or those of similar people. The original post is all uptight about content, but content doesn't matter. Socializing matters. No amount of actually thought-provoking content is going to save LessWrong unless the community improves. But the community's own standards won't allow it to improve, because you aren't properly regulating who is allowed to stay, among other issues, including the aforementioned issue of the community and not the content being the problem. Creating a surviving discussion website is not the same as creating a growing discussion website.

I won't get into the drama that will develop if I explain what I mean about regulating who can post, since you wouldn't implement my suggestion anyway. But I think many people know what I mean even if they don't agree, and we'll leave it at that.
This reads more like you're using my comment as an excuse to talk more about what you want to talk about than that you're responding in any meaningful sense to the actual content of my comment.
The first 2 sentences address what you said. The rest is a massive tangent because staying on the same train of thought is hard for me. Also I was too lazy to go through the nesting to post that in a better spot.
I'm super confused about what your point is, what your goals are, and in particular why dinner parties run counter to your goals/preferences.
What does this mean? I guess you mean "(some subset of) Alicorn's posts" (though I can't help thinking the way you've phrased it is suggestive of some kind of personal animosity), but which ones and what exactly do you think is wrong with them?
Is there directory of LessWrong Tumblr?
Yep. If you have a tumblr and want to be on the list, tell yxoque.
The solutions were bad on purpose so other people would come up with better solutions on the spot. I edited to clarify :)
The boundaries of relevance are something to think about. A lot of places outside LW host discussions. Political topics were a thing back then, but now apparently people mention them in Open Threads, and the most frequent talkers are still posting elsewhere. EA emerged, and with good coordination. However, this does not mean we should stop considering possible dynamic changes.

I think LessWrong has a lot of annoying cultural problems and weird fixations, but despite those problems I think there really is something to be gained from having a central place for discussion.

The current "shadow of LessWrong + SSC comments + personal blogs + EA forum + Facebook + IRC (+ Tumblr?)" equilibrium seems to have in practice led to much less mutual knowledge of cool articles/content being written, and perhaps to less cool articles/content as well.

I'd really like to see a revitalization of LessWrong (ideally with a less nitpicky culture and a lack of weird fixations) or the establishment of another central hub site, but even failing that I think people going back to LW would probably be good on net.

I've read some of the comments below, and I'm thinking, both for your own use and for further discussion, that it will help to distinguish between the different sorts of people on Less Wrong by reading this post by Ozy Frantz.
Not that they aren't here, but which ones are you talking about? What's a weird fixation to some might be an attractor for others, and vice versa.

In terms of weird fixations, there are quite a few strange things that the LW community seems to have as part of its identity - polyamory and cryonics are perhaps the best examples of things that seem to have little to do with rationality but are widely accepted as norms here.

If you think rationality leads you to poly or to cryo, I'm fine with that, but I'm not fine with it becoming such a point of fixation or an element of group identity.

For that matter, I think atheism falls into the same category. Religion is basically politics, and politics is the mind-killer, but people here love to score cheap points by criticizing religion. The fact that things like the "secular solstice" have become part of rationalist community norms and identity is indicative of serious errors IMO.

For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.


For me, one of the most appealing things about EA (as opposed to rationalist) identity is that it's not wrapped up in all this unnecessary weird stuff.

I'd consider EA itself to be one of those strange things that LW has as part of its identity. It's true that EA involves rationality, but the premises that EA is based on are profoundly weird. I have no desire to maximize utility for the entire human race in such a way that each person's utility counts equally, and neither does just about everyone else outside of the LW-sphere. I prefer to increase utility for myself, my family, friends, neighbors, and countrymen in preference to increasing the utility of arbitrary people. And you'll find that pretty much everyone else outside of here does too.

I don't view this as inconsistent with EA. I basically share the same preferences as you (except that I don't think I care about countrymen more than arbitrary people). On the other hand, I care a non-zero amount about arbitrary people, and I would like whatever resources I spend helping them to be spent efficiently. (Also, given the sheer number of other people, things like scientific research that would potentially benefit everyone at once feel pretty appealing to me.)
Well, that's a matter of semantics. I could say "I don't want to maximize utility added up among all people", or I could say "I assign greater utility to people closer to me, and I want to maximize utility given that assignment". Is that EA? If you phrase it the second way, it sort of is, but if you phrase it the first, it isn't. Also, I probably should add "and people who think like me" after "countrymen". For instance, I don't really care about the negative utility some people get when others commit blasphemy.
this was an unhelpful comment, removed and replaced by the comment you are now reading
I think there are plenty of people out there who do care to some extent about saving starving African children.
Yes, they care to some extent, but they would still prefer saving their own child from starvation to saving another child in a distant continent from starvation. Caring to some extent is not equally preferring.
I don't think any of the EA people wouldn't care more about their own child. To me that seems like a strawman.
The argument usually goes in reverse: since you'd care about your own child, surely you should care equally about this child in Africa who's just as human. It's presented as a reason to care more for the distant child, not care less for your own child. But it still implies that you should care equally about them, not care more about your own.
I don't know any EA who says that they have a utility function that treats every child 100% equally.
So, maybe this is just my view of things, but I think a big part of this conversation is whether you're outside looking in or inside looking out. For example, I'm neither poly nor signed up for cryo, but I'm open to both of those things, and I've thought them through and have a balanced sense of what facts about the world would have to change for my identification / recommendations to have to change. In a place where most people have seriously considered the issue, that gets me no weird looks. But saying "I'm open to cryo" to an audience of stereotypical skeptics comes across as an admission of kookery, and so that's the relevant piece about LW they notice: not "they don't scoff at ideas" but "they believe in cryonics more than normal."

Is that true? I mostly don't notice people scoring cheap points by criticizing religion; I mostly notice them ignoring religion.

Mmm. I would say that "religion is basically community"--they're the people you spend a lot of time with, they're the people you have a shared history / myth base with, they're people you can trust more than normal. And any community, as it becomes more sophisticated, basically becomes a 'religion.' The Secular Solstice is part of making a genuine sophisticated rationalist community--i.e., a rationalist religion, of the "brownies and babysitting" variety rather than the "guru sex cult" variety.
I'm on the inside and I think we should get rid of these things for the sake of both insiders and outsiders. See for instance Raising the Sanity Waterline, a post which raises very important points but is so unnecessarily mean-spirited towards religion that I can't particularly show it to many people. As Eliezer writes elsewhere:
Seems to me we have to differentiate between two things: a) x-rationality (rationality without compartmentalization), and b) LessWrong x-rationalist culture.

Rationality means thinking and acting correctly, not doing stupid stuff. Culture means creating an environment where people feel comfortable, and are encouraged to do (what the culture considers to be) the right thing. There is only one rationality, but there can be multiple rationalist cultures. Different cultures may work better for different people. But different people cannot have different definitions of rationality.

Seems to me that polyamory is a clearly cultural thing, atheism is a part of rationality itself (not believing in magic, not accepting "mysterious answers", reductionism), and cryonics is... somewhere in between, these days probably more on the cultural side. Secular solstice is obviously a cultural thing, and in my opinion not even a central component of the traditional LW culture; although it's obviously related.

I love the "old good hardcore LessWrong rationalist culture", and I would be sad to see it disappear. I want it to survive somewhere, and LW seems like the logical place. (I mean, where else?) But I don't want to push it on other people, if they object. I enjoy it, but I can understand if other people don't. I support experimenting with other rationalist cultures.

Not sure what the solution is here. Maybe making the cultures more explicit? Giving them names? Yes, this encourages tribal thinking, but on the other hand, names are Schelling points. (And if we don't have an explicit name for the culture, people will simply use "the rationalist community" as a name, and then there will be confusion when different people try to define it differently, when what they really mean is that they prefer different cultures.)

Actually, this could be an interesting topic for a separate discussion: Do we need a rationalist culture? What kinds of cultures (that we could consider rationalist) already exist?
I don't notice Less Wrong users bashing religion all the time. At some point in the past, there may have been more overlap with New Atheism, but because there are no new points being made in that domain these days, among other reasons, I don't observe this as much. Mind you, I could be biased, based on how I spend less time on Less Wrong the website these days, and spend more time discussing with friends on social media and at meetups, where bashing religion seems like it would take place less often anyway.

Mentally, I've switched out "politics is the mind-killer" for "politics is hard mode". That article was originally written by Robby Bensinger, and I think it works better than the original sentiment, for what it's worth.

I perceive the secular solstice as part of the rationalist community taking a step away from the public atheism and skeptic communities at large. While in many skeptic circles, or among casual atheists, people I know seem grossed out by the elements of piety and community devotion, it seems to me the rationalist community embraces them because they understand, psychologically, that replicating such activity from organized religion can engender happiness and be empowering. The rationalist community may be able to do so without receiving all the fake and false beliefs which usually come with the territory of organized religion. In embracing the secular solstice, perhaps the rationalist community isn't afraid of looking like a bunch of clowns to achieve their goals as a social group.

On the other hand, the secular solstice could be too heavy-handed with symbolism and themes of anti-deathism and transhumanism. I haven't attended one. I know there were big ones in Seattle, New York, and Berkeley in 2014, and I think only the latter was so overtly steeped in transhumanist memes. I could also have more sentimentality for the idea of a "secular solstice" than most non-religious folk, as I seem to perceive more value in "spirituality" than others.

Note how all the exodus is to places where people own their particular space and have substantial control over what's happening there. Personal blogs, tumblrs, etc. Not, say, subreddits or a new shinier group blog.

Posting on LW involves a sink-or-swim feeling: will it be liked/disliked? upvoted/downvoted? many comments/tepid comments/no comments? In addition, you feel that your post stakes a claim on everybody's attention, so you inevitably imagine it'll be compared to other people's posts. After all, when you read the Discussion page, you frequently go "meh, could've done without that one", so you imagine other people thinking the same about your post, and that pre-discourages you. In addition, a few years' worth of status games and signalling in the comments have bred to some degree a culture of ruthlessness and sea-lawyering.

So, these three: fretting about reactions; fretting about being compared with other posts; fretting about mean or exhausting comments. One way to deal with it is to move to an ostensibly less demanding environment. So you post to Discussion, but then everyone starts doing that, Main languishes and the problem reoccurs on Discussion. So you post to... (read more)

So, I have lots of thoughts and feelings about this topic. But I should note that I am someone who has stayed on LessWrong, and who reads a sizable portion of everything that's posted here, and thus there's some difference between me and people who left.

In order to just get this comment out there, I'm going to intermingle observations with prescriptions, and not try to arrange this comment intelligently.

Individual branding. There are lots of benefits to having your own site. Yvain can write about whatever topics he wants without any concern about whether or not other people will think the subject matter is appropriate--it's his site, and so it's what he's interested in. As well, people will remember that they saw it on SSC, rather than on LW, and so they'll be much more likely to remember it as a post of his.

This could be recreated on LW either by giving post authors more control over the page appearance for things they post (a different page header?), having author / commenter images, or by shifting the "recent on rationality blogs" from a sidebar to a section of similar standing to Main and Discussion. I must admit I haven't used reddit much, but I'm of the impressio... (read more)

Out of curiosity, why have you stayed, why do you read as much as you do, and how are you different?
I suspect I find reading and posting on forums more intrinsically motivating than most people; this was one of my primary hobbies before LW, and it will likely be one of my primary hobbies after LW. LW was just the best forum I had found.

People seem to be complaining about community fracturing, and good writers going off onto their own blogs. Why not just accept that and encourage people to post links to the good content from these places?

Hacker News is successful mainly because they encourage people to post their own blog posts there, to get a wider audience and discussion. As opposed to reddit where self promotion is heavily discouraged.

Lesswrong is based on reddit's code. You could add a link section, and just tell people it's ok to publish links to whatever they want there. This could be quite successful, given that Lesswrong already has a decent community to seed it with. As opposed to going off and starting another subreddit, where it's very hard to attract an initial user base (and you run into the self-promotion problem I mentioned).

Potentially worth actually doing - what'd be the next step in terms of making that a possibility? Relevant: a bunch of us are coordinating improvements to the identical EA Forum codebase at and
You'd need to convince whoever runs Lesswrong. There was some other discussion in this thread about modifying the code, but no point in doing that if they aren't going to push it to the site. Otherwise there is /r/RationalistDiaspora which is attempting to fill this niche for now.
Getting agreement from MIRI (likely Eliezer) that LW should be changed in that way.

Hey all,

As the admin of the effective altruism forum, it seems potentially useful to chip in here, or at least to let everyone know that I'm aware of and interested in this kind of conversation, since it seems like mostly everything that needs to be said has already been said.

The statement of the problem - online rationalist discourse is more fractured than is optimal - seems plausible to me.

I think that SSC and Scott's blogging persona are becoming quite a bit bigger than LessWrong currently is - it's got to the stage where he's writing articles that are getting thousands of shares, republished in the New Statesman, etc. I think SSC's solo blogging is striking a winning formula and shouldn't be changed.

For the EA Forum, the risk has always been that it would merely fracture existing discussion rather than generating any new discussion of its own. People usually don't think enough about how their project could become a new competing standard, because they have a big glorious vision of how it would be. The people who are enthusiastic enough to start a project tend to be way out on the bell curve in terms of estimating how successful it is likely to be, so it can be unthinkable that it would end up as 'just an... (read more)

I wonder if we should be distinguishing between essays and discussions here. The subreddit might end up fracturing discussions by adding a new place to comment, but unifying essays by adding a place to find them without needing to subscribe to everybody's personal blog.

FYI: I've just made this:

See: discussion in this thread.

A possible dark explanation:

The main reason people cared about lesswrong was that Scott and Eliezer posted on lesswrong. Neither posts on lesswrong anymore. Unless some equally impressive thinkers can be recruited to post on LW, the site will not recover.

I'll weigh in and say that neither Scott nor Eliezer were much of an incentive for posting on LW. Mostly I like the high standards of discussion in the comments, and the fact that there is a much lower inferential distance on many important topics.
Yeah, strangely Yvain, lukeprog, and Eliezer definitely weren't my favorite writers on LW but perhaps the volume of their contributions led to positive network effects.

Mm, I think an aggregator of Less Wrong, SSC, EA Forum, and OB posts would be great, but only if all of the former have an easy (visible) link to it. It could allow more traffic to flow between those gravity centers. It may be better than crossposting.

ESRogs just made this.

Pros of having it on Reddit:

  1. It's a clearly neutral place, with no history or baggage.
  2. It's a bit more cleanly set up for link posts.
  3. Instead of a potentially costly change to the LW codebase, it's already done.

Cons of having it on Reddit, instead of on LW (see this other comment of mine for suggestions on how that could be done):

  1. It requires a different account, and a new account for anyone who doesn't already use Reddit.
  2. It doesn't inherit the good parts of the history, like tying the Yvain of SSC links to the Yvain of Generalizing From One Example.
  3. It creates a new source of gravity, potentially diffusing things even more, rather than consolidating them. Instead of conversations in SSC comments and tumblr and Facebook and a LessWrong link post, we now might have conversations in SSC comments, tumblr, Facebook, a LessWrong link post, and Reddit.
I would be surprised if that subreddit got traction. I was thinking of something more like Reaction Times (damn Scott and his FAQ), placed visibly on all of the rationality-related sites: a coordinated effort. Well, the idea was not to comment in the aggregator; that way it would be like a highway, taking you to other sites in 2 clicks (3 max). If that is not possible, I'm not sure there would be any impact, besides making another gravity center.
I'm thinking about whether to try to explicitly establish this as a norm of /r/RationalistDiaspora. Haven't made up my mind yet.
Comments in the aggregator makes much more sense to me--no trivial inconvenience to posting a comment, and people can read the comments to determine whether or not to follow the link, and it means every link has access to Reddit-quality commenting (karma, threads, etc.) regardless of how the source is set up. It does make it harder for the content creator to see those comments.
But in that case the people will be even more diluted. Why create another gravity center? That's the issue we are trying to solve. I'm mostly convinced that it would be better if the aggregator had no comments. Edit: I guess the aggregator has more traffic than I thought; I'm just worried there will be only a one-way flow from Less Wrong to all the other sites.
It might also make sense to have multiple parallel discussions with different norms, so that people who are turned off by one set of norms can still comment elsewhere. (This does run the risk of fragmentation.) ...though I'd suggest that if we're going to discuss the comment policy of the new place, we should do that in a meta thread at the new place.

I agree with the comments (like John Maxwell's) that suggest that Less Wrong effectively discourages comments and posts. My karma score for the past 30 days is currently +29, 100% positive. This isn't because I don't have anything controversial to say. It is because I mostly stopped posting the controversial things here. I am much more likely to post them on Scott's blog instead, since there is no voting on that blog. I think this is also the reason for the massive numbers of comments on Scott's posts -- there is no negative incentive to prevent that there... (read more)

Leave Main for articles introducing concepts needed to discuss things (like 'bias') and Discussion for specific examples of using these concepts, and divide the Open thread into Quantitative models, Developing models and Anecdata.

I think "LW type" rationalists should learn to be colleagues rather than friends. In other words, I think the win condition is if you agree on the ideals, but possibly bicker on a personal level (successful academic communities are often like this).

Sorry, this was a useless post, so now it's gone.
There are brain-imposed bounds on movement growth if you insist on staying in your cuddle pile. Why conflate your personal social goals and general movement goals?
sorry, this was an unhelpful comment that is now gone :)
They are not necessarily, but they are in this case. I think Scott once mentioned that BA rationalists can't grow beyond about 150. 150 is a magic number, and is suggestive of what the problem might be.

"Cuddle pile" is my slightly unkind shorthand for the kinds of social peculiarities rationalists, imo, should leave behind if they want the ideas to become more mainstream.

Metacomment: "it is not necessarily the case that X" is almost always true for interesting X.
I suspect most rationalists will turn out to care more about their cuddle piles than about their ideas becoming mainstream. There's always been a rather unhealthy interaction between community goals and the community's social quirks (we want to raise the sanity waterline -> we are saner -> our quirks should be evangelized), and we don't really have a working way to sort out what actually comes with increased rationality and what's just a founder effect.
I agree. And that's too bad.

I have been trying to serve as a bit of a "loyal opposition" re: separating rationality from social effects. But I am just one dude, and I am biased, too. Plus, I am an outsider, and my opinions don't really carry a lot of weight outside my area of expertise, around here. The community itself has to want it, on some level.

Honestly, my marginal returns from spending time on LW dropped drastically after I finished reading the sequences. Attending local meetups was kinda fun to meet some like-minded people, but they inevitably were far behind in the sequences and for the most part always struck me as trying to identify as a rationalist rather than trying to become more rational. This strikes me as the crux of the issue: LW has become (slash might have always been) an attractor of nerd social status, which is fine if that's its stated goal, though this doesn't seem to be the is... (read more)


If you have drafts you think are not good enough for LW, then polish them, include the criticisms of which you can think, make a falsifiable prediction and GO POST THEM ON YOUR OWN BLOG. Link to LW articles on specific biases that could have guided your thoughts if you can identify them. You do not owe anyone anything, and if you write well enough, you will have readers. Make your own rules, change them when you need to, hell, STOP BLOGGING if you don't feel the need.

It does not mean that you have to leave LW. Comment, post, IGNORE KARMA HITS, comment on Y... (read more)

I'm surprised by this idea of treating SSC as a rationalist hub. I love Scott, Scott's blog, and Scott's writing. Still, it doesn't seem like it is a "rationality blog" to me. Not directly at least. Scott is applying a good deal of epistemic rationality to his topics of interest, but the blog isn't about epistemic rationality, and even less so about practical rationality. (I would say that Brienne's and Nate's 'self-help' posts are much closer to that.) By paying attention, one might extract the rationality principles Scott is using, but they're ... (read more)

A few tangential ideas off the top of my head:

If the moderation and self selection of Main was changed into something that attracts those who have been on LW for a long time, and discussion was changed to something like Newcomers discussion, LW could go back to being the main space, with a two tier system (maybe one modulated by karma as well).

  1. People have been proposing for a while that we create a third section of LW for open threads and similar content.

  2. We could have a section without any karma scores for posts/upvote only, though we could still ke

... (read more)
Those are all phrased as "do you agree that people are saying X" or "do you agree that we could X" rather than "is X a good idea".
Good point, thanks. I was already not a fan of the way the polls made the post look, so I went ahead and took them down. I could replace them with something better, but I think this thread has already gotten most of the attention it's going to get, so I might as well just leave the post as it is.

People enjoy writing elsewhere more because they don't have to write about "refining the art of human rationality," which is the stated topic and purpose of LW. Actually making progress on this topic is difficult and fairly dry. If you're concerned that we're missing out on the rationality-relevant content they post elsewhere, just ask them for permission to repost on LW. I know this is already happening with some Slate Star Codex posts.

I'm not exactly a top-tier contributor, but my writings here tend to get positive responses, and the reason I don't write more is chiefly lack of ideas. One thing I'm doing is resolving right now to try to write more on LW; another is resolving to be willing to post a broader variety of things until I actually get some negative feedback that I should narrow.

But as far as methods external to myself go, I wonder if something like a topic of the month could seed participation. Maybe do posts with discussion questions--I actually really enjoyed these on the Superintelligence reading group posts.

This is not a well thought out post, in keeping with the nature of the subject matter. Less Wrong does seem to encourage solidified thoughts rather than subconscious reactions. A good thing, I think, but difficult all the same. Ideas follow.

  • An IRC-style (not necessarily chat) section which has neither votes nor a delineation between post and comment. An area for LWers to post thoughts as they occur. Restrict formatting of these posts to plain text. Not a design choice, so much as to encourage train-of-thought style conversation.
  • Why upvotes at all? Why not a well defined rating scheme, in addition to use of belief tags in standalone Main and Discussion posts?

I feel the need to go a bit meta.

A bunch of people here expressed discomfort with downvoting. Essentially, they are saying that the likelihood of criticism -- either overt (the post gets skewered) or covert (the post gets silently downvoted) -- discourages them from doing things such as posting content.

Let me agree that this is a problem. It's a problem of being thin-skinned and it's a big problem for these people. The thing is, real life is not a support group full of nice boys and girls with gold stars for everyone and no criticism ever because it migh s... (read more)

No... but real life is a place where people can take their balls and go home, because they don't want to play with you anymore. Eliezer doesn't have to post to LW; Yvain doesn't have to post to LW; interesting people can just go elsewhere and do things that are more fun for them, and the more interesting they are, the more likely they are to have other options. Yes, there should be a place on LW where people can ruthlessly skewer technical ideas, but empirically we are massively losing out by limiting the audience of LW to TOUGH GUYS who can HANDLE CRITICISM.
First, not audience but content creators; but second, is this so? Did any of the really valuable contributors to LW go away because they were driven away by incessant criticism? You think Scott Alexander moved to SSC because he couldn't handle the downvotes?

The general cry here seems to be "We want more content!". Well, I don't want more content. I have a whole internet full of content. What I want is more high-quality content that I do not need to search through piles of manure to find. The great advantage of LW is that here pearls are frequent but bullshit is rare -- and I attribute this in not a small degree to the fact that you'll be punished (by downvotes and comments) for posting bullshit.

A system without downvotes encourages posting, true, but it encourages posting of everything, including cat pictures and ruminations on a breakfast sandwich in three volumes. Someone has to do pruning, and if you take this power away from the users, it'll fall to the moderators. I don't see why this would be better -- and people whose cat got disrespected will still be unhappy.
Didn't Eliezer say somewhere that he posts on Facebook instead of LW nowadays because on LW you get dragged into endless point-scoring arguments with dedicated forum arguers and on Facebook you just block commenters who come off as too tiresome to engage with from your feed?
As far as I understand (it isn't very far), Eliezer prefers Facebook basically because it gives him control -- which is perfectly fine, his place on FB is his place and he sets the rules. I don't think that degree of control would be acceptable on LW -- the local crowd doesn't like tyrants, even wise and benevolent.
On the LW facebook group, Eliezer occasionally bans people who post really low-quality content. The same goes for his own feed. If Eliezer banned someone on LW, on the other hand, he would get a storm of criticism.
I'm curious what solution would work here. Suppose you had a list of ~10 users with 'censor' power, and the number of censors who have 'remonstrated' a user is public, possibly also with the remonstrations. "Don't be a jerk," or "don't promote other sites in your early posts," or "think before you speak," or so on. If a sufficient number of censors have remonstrated a user, then they're banned, but censors can lift their remonstration once it's no longer appropriate.

Thoughts on this solution:

  1. Reasoning is clear and transparent, and gradual. Instead of "all clear" suddenly turning to "can't post anymore," people are put 'on notice.'
  2. If which censor has remonstrated a user is hidden, it isn't "Eliezer" using his dictatorial powers; it's some moderator moderating.
  3. If which censor has remonstrated a user is hidden, the drama might multiply rather than decrease. Now an offending user can message the entire group of censors, pleading to have their remonstration removed, or complain bitterly that clearly it was their enemy who is a censor, regardless of whether or not that was actually the person that remonstrated with them.
  4. If three out of ten moderators agree that a poster should stop posting, then it becomes much easier to defend the action to remove the poster.
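For concreteness, the scheme could be sketched in a few lines of code. This is purely a toy illustration of the proposal above, not an existing LW feature; the class name, the censor names, and the threshold of flags needed for a ban are all hypothetical choices:

```python
# Toy sketch of the proposed "remonstration" moderation scheme:
# censors flag users, a user with enough active flags is banned,
# and lifting flags reverses the ban automatically.

class RemonstrationBoard:
    def __init__(self, censors, threshold=3):
        self.censors = set(censors)
        self.threshold = threshold
        self.flags = {}  # user -> set of censors who have remonstrated

    def remonstrate(self, censor, user):
        if censor not in self.censors:
            raise ValueError("only censors may remonstrate")
        self.flags.setdefault(user, set()).add(censor)

    def lift(self, censor, user):
        # A censor withdraws their remonstration once it no longer applies.
        self.flags.get(user, set()).discard(censor)

    def is_banned(self, user):
        # The ban is gradual and reversible: it tracks active flags only.
        return len(self.flags.get(user, set())) >= self.threshold


board = RemonstrationBoard(censors=["a", "b", "c"], threshold=2)
board.remonstrate("a", "troll")
board.remonstrate("b", "troll")
print(board.is_banned("troll"))  # True: two active flags meet the threshold
board.lift("a", "troll")
print(board.is_banned("troll"))  # False: the ban lifts with the flag
```

One design question the sketch makes visible: whether `flags` is public or hidden is orthogonal to the banning mechanics, which is exactly the tension points 2 and 3 above are about.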
That's a bureaucratic solution. But it doesn't really get at the heart of the issue. Eliezer acts that way because of the Roko affair and people telling him that he shouldn't have moderated. In that case the decision being made by three people instead of one wouldn't have made it more defensible. This forum currently has MIRI ties that make controversial moderating decisions reflect badly on MIRI. A solution would be to cut those ties and give LW into the hand of a small group of moderators who are more free to focus on what's good for the community instead of larger PR effects.
He did explicitly point out that this culture of criticism / high standards makes writing for LW a chore, and so he doesn't do it anymore. So, yes. I am not advocating for the removal of downvotes; I think they serve a necessary function, and I think having some sort of pruning and sorting methodology is a core site feature. But to cultivate good content, it is not enough to just remove bad content.
Let's bring in the entire quote. Yvain said:

Note that the first three points have nothing to do with criticism. The fourth point is the requirement to show evidence, which still isn't criticism. And the final point I read as having to be literal and formal with little "free play" in the moving parts -- I think there is a connection with the recent series of posts by Jonah Sinick where he talks about how gestalt pattern recognition is, at a certain level, superior to formal reasoning (and LW expects formal reasoning).

Yeah, I still think Scott Alexander could handle the downvotes just fine.

I agree, but the suggestions offered tend to gravitate to "Let's just be nice to everyone"... What kind of positive incentives to creators of high-quality content can LW come up with?
The thing is, the high standards on LW that Yvain refers to are precisely what makes LW content valuable. At some level, wanting to escape requirements such as airtight reasoning means you want to write stuff that doesn't have airtight reasoning.
Yes, I agree. That's why I think "more content" is the wrong yardstick. I want "more high-quality content" which you don't get by relaxing standards. Correct, but that's fine. There is a lot of high-quality and valuable stuff that is not airtight-reasoned.
I've refrained from posting because I expected to get really banal criticism. You may or may not consider that a loss. But I kind of get the impression that Scott feels somewhat similarly. It's not like he doesn't get criticized on SSC. I think this isn't a case of me needing to HTFU. (Other self-modification would have worked, but it would also not be very useful outside of LW.) So it may not be relevant to what you're trying to say. But I also wonder whether other people feel similarly, and are expressing it in ways that you're interpreting as them needing to HTFU.
Just ignore it? I think yours is a different case -- it's as if you want better readers than the LW crowd. Would you be fine with insightful and to-the-point skewering?
I don't think "better" readers would be a helpful way to frame it. There are lots of dimensions of quality. E.g. one of the HN comments said which is a bad comment in a way that I don't think would get traction on LW. I think... maybe one factor is comments that are bad because they're wrong, and comments that are bad because they're right but, really, who cares? Like jaywalking in front of a policeman who then stops you, gives you a stern lecture, and you have to say yes officer and no officer and so on. It feels more like a power trip than an actual attempt to make me or anyone else safer. If insightful and to-the-point skewering was justified, then I wouldn't enjoy it and it might put me off future posting (and maybe it should), but I hope I would find it valuable and take it as a sign that I needed to level up.
So, nit-picking? Yes, it's popular on LW :-/ but (a) you are still free to ignore those; and (b) as opposed to the example with the cop, there is no inherent power imbalance. Nothing prevents you from going meta and pointing out the difference between what is important and what is not. Do I read you right in that you want more co-travelers in figuring out problems and solutions and fewer critics who carefully examine your text for minor flaws and gotchas, basically?
On reflection, I'm not sure that nitpicking is quite the problem that I'm pointing at, but I don't think I have a very good handle on what is. (I do think nitpicking is a problem.) Maybe next time I have that feeling, I'll just post anyway and see what happens.
It often takes a special effort to -notice- that a criticism isn't meaningful, especially when it is correct - especially because Less Wrong entertains a -much- higher level of pedantry than will generally be encountered elsewhere. More problematically, pedantry tends to get upvoted, which means people may pay too much attention to it, and also that it is being encouraged. If we're interested in discouraging pedantry-for-the-sake-of-pedantry, I'd lean towards implementing an applause-lights keyword to indicate that a criticism may be valid but doesn't actually add anything to what is being said, along the lines of how "Updating" was used as an applause-lights keyword to counterbalance the generally negative attitude people start with towards admitting wrongness.
True -- but I think it's a very useful skill to develop and practice. And that is probably a feature of the local culture by now, heavily supported by the meme of how you can't make even one tiny little itty bitty mistake when programming the AI because if you do it's all paperclips all the time. I call such things "technically correct, but irrelevant", but I don't think this expression functions well as an applause-lights switch. Ideas?
The best opposite to "pedantry" I can come up with is "pragmatic." Pragmatism is a relatively good value on Less Wrong, but I don't see a good application. Yours seems good. It concedes the argument attempted to be raised, shutting off further discussion - a very desirable quality when dealing with somebody who is specifically looking for something to argue with - and rebuts the fundamental problem, redirecting future attention there. (Minor shift for reasons I have trouble explicating, but which seems a stronger, slightly harsher version of the sentiment - "Technically correct. Also irrelevant.") If it's used appropriately, and consistently, I think it could become an applause-light within the sub-culture here.
Ah, a vote in favour of strife. Yes, that's what it is. If you start off from the premise of a world full of unfair, mean, nasty people, you still have the choice of either adapting by joining their ranks, or ensuring that the patch of reality you control remains well-defended from the corruption. This is a very useful matter to conceive of in terms of tendencies. What to promote: harmony, or strife? You're pushing for more strife now, in response to what seems to me you conceive of as overzealous pro-harmony efforts, but with that attitude I have no guarantee that you won't push for strife even further, even with the talk of balances and all. Ironically enough, I do think that LW is pretty balanced in that regard (with some outliers, of course), so on the surface I agree that downvotes shouldn't have a great emotional impact on a reasonably stable individual. It's the attitude that begets disapproval, not the facts as they now stand. You'll still be more comfortable with trolling than with sensitivity even after this bout of excess sensitivity has passed or been successfully countered. There are 1) better and 2) enough venues for getting acquainted with the harsher realities of the world. Why anyone would try to make more of them out of milder spaces is beyond me. But I suppose conflict is another one of those acquired tastes. On the topic of negative feedback: this line right here illustrates the belief that the dignified way to deal with mean-spirited criticism is never to internalise it; presumably to have / express a low opinion of the criticiser right back? Criticism, in order to serve some useful purpose besides just creating tension between people, has to be listened to; otherwise it's just a pointless battle between my pride and yours. Who knows, maybe the person really is an idiot who should never ever try anything like he/she did again. If the local culture has it that that option is never even up for consideration, every attempt at criticis…
Nope, that's what it is not. That specific comment is really not about the LW voting system at all; it's about people's ability to take criticism (of various sorts, including the totally unfair kind) and the usefulness of such an ability. Still nope, even in the context of LW karma that's the wrong framework. Negative feedback is not strife -- if you screwed up and no one will tell you so because it's not nice, you will continue to screw up until reality delivers the message to you. Feedback and consequences are a much more useful set of terms to use. LOL. Would you like to... adjust my attitude? X-D You're missing a very important part: distinguishing between the criticism of an idea or a proposal, and the criticism of a person. You should listen and pay attention to the criticism of your ideas. You should not interpret the criticism of your ideas as criticism of your self/identity/personality/soul/etc.

Nate Soares' blog seems excellent, from what I've read; I don't read all of it. He posts approximately once or twice per week, and writes his blog posts in the form of sequences, as Eliezer and Luke have done in the past. He doesn't seem to have slowed down in recent weeks while coming into his role as executive director of MIRI. I'm unsure if he'll blog less frequently once he takes on his new role at MIRI in full. Anyway, if he intends to keep blogging every couple of weeks, you/we could ask him to cross-post as many blog posts as he feels like to Less Wrong, as m…

He's already cross-posted several, but I don't see this solution working long-term, or generalizing to many people, unless it is technically very easy.

Another piece of the rationalist diaspora is neoreaction. They left LW because it wasn't a good place for talking about anything politically incorrect, an ever-expanding set. LW's "politics is the mindkiller" attitude was good for social cohesion, but bad for epistemic rationality, because so many of our priors are corrupted by politics and by yesterday's equivalent of social justice warriors.

Neoreaction is free of political correctness and progressive moral signaling, and it takes into account history and historical beliefs when forming priors abo…

I think I learned what I needed to learn about Moldbug and neoreaction based on his reaction to Scott's post. "Intellectual progress" is when you engage with your critics.
Scott focused heavily on engaging Michael Anissimov's positions, and he did reply to them.
Agreed. It would have been interesting to see a back and forth between those two. Scott's open-mindedness would have made him an ideal interlocutor for Moldbug; he missed a great opportunity there.
I think many people would have loved to see a response by Moldbug, and found his actual response disappointing. My guess is that Moldbug felt that his writings already answered a lot of Scott's objections, or that Scott's approach wasn't fair. And Moldbug isn't the same thing as neoreaction; there were other responses by neoreactionaries to Scott's FAQ. The FAQ nails neoreaction on a lot of object-level issues, and it has some good philosophical objections. But it doesn't do a good job of showing the object-level issues that neoreaction got right, and it doesn't quite do justice to some ideas, like The Cathedral and demotism. And the North Korea material invites easily anticipated objections from neoreactionaries (like the fact that it was led by communists). The FAQ answers the question "what are a bunch of objections to neoreaction?", but it doesn't answer the question "how good a philosophy is neoreaction?" because it only makes a small dent. If you consider the FAQ in conjunction with Neoreactionary Philosophy in an Enormous, Planet-sized Nutshell, you would get a better sense of the big picture of neoreaction, but he doesn't really integrate his arguments across the two essays, which leaves an unfortunately misleading impression. The FAQ put me off getting into neoreaction for a while, but when I did, I was much more impressed than I expected. The only way to get a good sense of what it actually is would be spending a lot of time with it.
Things that need to happen before I take NRx any sort of seriously: 1. Someone hires an editor for Moldbug and publishes a readable, structured ebook. Currently I have no idea whether Moldbug's writings really answered Scott's objections, and finding out looks like more work than a generic reader should be expected to do.
Well, Michael Anissimov has just published an ebook. Disclaimer: I have not read it and thus cannot make any statements about its contents.
And it gets a bunch of the object-level issues wrong, as Michael Anissimov has pointed out.
Fully agreed. Oops, accidentally retracted this and can't fix it.
Without getting into NRx issues, this sentence is very wrong.
Arguing and pursuing truth are indeed not the same thing, but when virtually every empirical, numerical claim is falsified by an opponent, that is a situation where arguing, or changing one's mind, is really called for. To be fair, when they were making those claims I already smelled something. I have some familiarity with the history of conservative thought, back through Oakeshott, Chesterton, Burke, and Cicero, and those thinkers never just pointed to a crime stat or the like and said: see, that is what is wrong here. That was never their strength, and I was half-expecting that chart duels were something the neoreactionaries were not going to win.
"Taking in account history" means for neoreactionaries deconstrutivist techniques and not factual discussion for which evidence has to be presented. At least that's a position that Moldbug argued explicitely. When you look at the success of Moldbug predictions such as Bitcoin going to zero, you find that Moldbug is very bad at political understanding because he let's himself get blinded by stories.