EDIT: Thanks to people not wanting certain words google-associated with LW: Phyg
Lesswrong has the best signal/noise ratio I know of. This is great. This is why I come here. It's nice to talk about interesting rationality-related topics without people going off the rails about politics/fail philosophy/fail ethics/definitions/etc. This seems to be possible because a good number of us have read the lesswrong material (sequences, etc), which inoculates us against that kind of noise.
Of course Lesswrong is not perfect; there is still noise. Interestingly, most of it is from people who have not read some sequence and thereby make the default mistakes or don't address the community's best understanding of the topic. We are pretty good about downvoting and/or correcting posts that fail at the core sequences, which is good. However, there are other sequences, too, many of them critically important to not failing at metaethics/thinking about AI/etc.
I'm sure you can think of some examples of what I mean. People saying things that you thought were utterly dissolved in some post or sequence, but they don't address that, and no one really calls them out. I could dig up a bunch of quotes but I don't want to single anyone out or make this about any particular point, so I'm leaving it up to your imagination/memory.
It's actually kind of frustrating seeing people make these mistakes. You could say that if I think someone needs to be told about the existence of some sequence they should have read before posting, I ought to tell them, but that's actually not what I want to do with my time here. I want to spend my time reading and participating in informed discussion. A lot of us do end up engaging with mistaken posts, but that lowers the quality of discussion here because so much time and space has been spent battling ignorance instead of advancing knowledge and discussing real problems.
It's worse than just "oh here's some more junk I have to ignore or downvote", because the path of least resistance ends up being "ignore any discussion that contains contradictions of the lesswrong scriptures", which is obviously bad. There are people who have read the sequences and know the state of the arguments and still have some intelligent critique, but it's quite hard to tell the difference between that and someone explaining for the millionth time the problem with "but won't the AI know what's right better than humans?". So I just ignore it all and miss a lot of good stuff.
Right now, the only stuff I can be reasonably confident is intelligent, informed, and interesting is the promoted posts. Everything else is a minefield. I'd like there to be something similar for discussion/comments. Some way of knowing "these people I'm talking to know what they are talking about" without having to dig around in their user history or whatever. I'm not proposing a particular solution here, just saying I'd like there to be more high quality discussion between more properly sequenced LWers.
There is a lot of worry on this site about whether we are too exclusive or too phygish or too harsh in our expectation that people be well-read, which I think is misplaced. It is important that modern rationality have a welcoming public face and somewhere that people can discuss without having read three years worth of daily blog posts, but at the same time I find myself looking at the moderation policy of the old sl4 mailing list and thinking "damn, I wish we were more like that". A hard-ass moderator righteously wielding the banhammer against cruft is a good thing and I enjoy it where I find it. Perhaps these things (the public face and the exclusive discussion) should be separated?
I've recently seen someone saying that no-one complains about the signal/noise ratio on LW, and therefore we should relax a bit. I've also seen a good deal of complaints about our phygish exclusivity, the politics ban, the "talk to me when you read the sequences" attitude, and so on. I'd just like to say that I like these things, and I am complaining about the signal/noise ratio on LW.
Lest anyone get the idea that no-one thinks LW should be more phygish or more exclusive, let me hereby register that I for one would like us to all enforce a little more strongly that people read the sequences and even agree with them in a horrifying manner. You don't have to agree with me, but I'd just like to put out there as a matter of fact that there are some of us that would like a more exclusive LW.
I've lurked here for over a year and just started posting in the fan fic threads a month ago. I have read a handful of posts from the sequences and I believe that some of those are changing my life. Sometimes when I start a sequence post I find it uninteresting and I stop. Posts early in the recommended order do this, and that gets in the way every time I try to go through in order. I just can't be bothered because I'm here for leisure and reading uninteresting things isn't leisurely.
I am noise and I am part of the doom of your community. You have my sympathy, and also my unsolicited commentary:
Presently your community is doomed because you don't filter.
Noise will keep increasing until the community you value splinters, scatters, or relocates itself as a whole. A different community will replace it, resembling the community you value just enough to mock you.
If you intentionally segregate based on qualifications your community is doomed anyway.
The qualified will stop contributing to the unqualified sectors, will stop commending potential qualifiers as they approach qualification, and will stop driving out never-qualifiers with disapproval. Noise will win as soon as something ... (read more)
I suspect communities have a natural life cycle and most are doomed. Either they change unrecognisably or they die. This is because the community members themselves change with time and change what they want, and what they want and will put up with from newbies, and so on. (I don't have a fully worked-out theory yet, but I can see the shape of it in my head. I'd be amazed if someone hasn't written it up.)
What this theory suggests: if the forum has a purpose beyond just existence (as this one does), then it needs to reproduce. The Center for Modern Rationality is just the start. Lots of people starting a rationality blog might help, for example. Other ideas?
(nods) OK, cool.
My working theory is that the original purpose of the OB blog posts that later became LW was to motivate Eliezer to write down a bunch of his ideas (aka "the Sequences") and get people to read them. LW continues to have remnants of that purpose, but less and less so with every passing generation.
Meanwhile, that original purpose has been transferred to the process of writing the book I'm told EY is working on. I'm not sure creating new online discussion forums solves a problem anyone has.
As that purpose gradually becomes attenuated beyond recognition, I expect that the LW forum itself will continue to exist, becoming to a greater and greater extent a site for discussion of HP:MoR, philosophy, cognition, self-help tips, and stuff its users think is cool that they can somehow label "rational." A small group of SI folks will continue to perform desultory maintenance, and perhaps even post on occasion. A small group of users will continue to discuss decision theory here, growing increasingly isolated from the community.
If/when EY gets HP:MoR nominated for a Hugo award, a huge wave of new users will appear, largely representative of science-fictio... (read more)
If anyone does feel motivated to post just bare links to sequence posts, hit one of the Harry Potter threads. These seem to be attracting LW n00bs, some of whom seem actually pretty smart - i.e., the story is working to its intended purpose.
I can understand people wanting that. If the goal is to spread this information, however, I'd suggest that those wanting to be part of an Inner Circle should go Darknet, invitation only, and keep these discussions there, if you must have them at all.
As someone who has been around here maybe six months and comes every day, I have yet to drink enough Kool-Aid not to find ridiculous elements in this discussion.
"We are not a Phyg! We are not a Phyg! How dare you use that word?" Could anything possibly make you look more like a Phyg than tabooing the word, and karmabombing people who just mention it? Well, the demand that anyone who shows up should read a million words in blog posts by one individual, and agree with most all of it before speaking does give "We are not... (read more)
I'm amused by the framing as a hypothetical. I'm far from being an old-timer, but I've been around for a while, and when I was new to this site a discussion like this was going on. I suspect the same is true for many of us. This particular discussion comes around on the guitar like clockwork.
In my case it left the impression that (a) this was an Internet forum like any other I've been on in the past seventeen years (b) like all of them, it behaved as though its problems were unique and special, rather than a completely generic phenomenon. So, pretty much as normal then.
BTW, to read the sequences is not to agree with every word of them, and when I read all the rest of the posts chronologically from 2009-2011 the main thing I got from it was the social lay of the land.
(My sociology is strictly amateur, though an ongoing personal interest.)
To LW's credit, "read the sequences" as a counterargument seems increasingly rare these days. I've seen it once in the last week or two, but considering that we're now dealing with an unusually large number of what I'll politely describe as contrarian newcomers, I'll still count that as a win.
In any case, I don't get the sense that this is an unknown issue. Calls for good introductory material come up fairly often, so clearly someone out there wants a better alternative to pointing newcomers at a half-million words of highly variable material and hoping for the best -- but even if successful, I suspect that'll be of limited value. The length of the corpus might contribute to accusations of phygism, but it's not what worries me about LW. Neither is the norm of relating posts to the Sequences.
This does give me pause, though: LW deals politely with intelligent criticism, but it rarely internalizes it. To the best of my recollection none of the major points of the Sequences have been repudiated, although in a work of that length we should expect some to have turned out to be demonstrably wrong; no one bats a thousand. A few seem to have slipped out of the de-facto canon... (read more)
Reply not with "read the sequences", but with "This is covered in [link to post], which is part of [link to sequence]." ? Use one of the n00b-infested Harry Potter threads, with plenty of wrong but not hopeless reasoning, as target practice.
[meta] A simple reminder: This discussion has a high potential to cause people to embrace and double down on an identity as part of the inner or outer circles. Let's try to combat that.
In line with the above, please be liberal with explanations as to why you think an opinion should be downvoted. Going through the thread and mass-downvoting every post you disagree with is not helpful. [/meta]
The post came across to me as an explicit call to such, which is rather stronger than "has a high potential".
I agree. Low barriers to entry (and utterly generic discussions, like on which movies to watch) seem to have lowered the quality. I often find myself skimming discussions for names I recognize, and just read their comments - ironic, given that once upon a time the anti-kibitzer seemed pressing!
Lest this be seen as unwarranted arrogance: there are many values of p in [0,1] such that I would run a p risk of getting personally banned in return for removing the bottom p of the comments. I often write out a comment and delete it, because I think that, while above the standard of the adjacent comments, it is below what I think the minimal bar should be. Merely saying new, true things about the subject matter is not enough!
The Sequence Re-Runs seem to have had little participation, which is disappointing - I had great hope for those.
As someone who is rereading the sequences I think I have a data point as to why. First of all, the "one post a day" pace is very difficult for me to keep. I don't have time to digest a LW post every day, especially if I've got an exam coming up or something. Secondly, I joined the site after the effort started, so I would have had to catch up anyway. Thirdly, ideally I'd like to read at a faster average rate than one per day. But this hasn't happened at all; my rate has actually been rather slower, which is kind of depressing.
Edit: Eliminated text to conform to silly new norm. Check out relevant image macro.
It's whimsical, I like it. The purported SEO rationale behind it is completely laughable (really, folks? People are going to judge the degree of phyggishness of LW by googling LW and phyg together, and you're going to stand up and fight that? That's just insane), but it's cute and harmless, so why not adopt it for a few days? Of all reasons to suspect LW of phyggish behavior, this has got to be the least important one. If using the word "phyg" clinches it for someone, I wouldn't take them seriously.
Why in the name of the mighty Cthulhu should people on LW read the sequences? To avoid discussing the same things again and again, so that we can move to the next step. Minus the discussion about definitions of the word phyg, what exactly are we talking about?
When a tree falls down in a LessWrong forest, why there is a "sound":
Because people on LW are weird. Instead of discussing natural and sane topics, such as cute kittens, iPhone prices, politics, horoscopes, celebrities, sex, et cetera, they talk about crazy stuff like thinking machines and microscopic particles. Someone should do them a favor, turn off their computers, and buy them a few beers, so that normal people can stop being afraid of them.
Because LW is trying to change the way people think, and that is scary. Things like that are OK only when the school system is doing it, because the school system is accepted by the majority. Books are usually also accepted, but only if you borrow them from a public library.
Because people on LW pretend they know some things better than everyone else, and that's an open challenge that someone should go and kick their butts, preferably literally. Only strong or popular people are ... (read more)
No, that isn't it. LW isn't at all special in that respect - a huge number of specialized communities exist on the net which talk about "crazy stuff", but no one suspects them of being phygs. Your self-deprecating description is a sort of applause lights for LW that's not really warranted.
No, that isn't it. Every self-help book (of which there's a huge industry, and most of which are complete crap) is "trying to change the way people think", and nobody sees that as weird. The Khan academy is challenging the scho... (read more)
It's not the Googleability of "phyg". One recent real-life example is a programmer who emailed me deeply concerned (because I wrote large chunks of the RW article on LW). Given the strong local support for complete decompartmentalisation, they were seriously worried about LessWrong's potential for decompartmentalising really bad ideas, a worry prompted by this detailed exploration of how to destroy semiconductor manufacture to head off the uFAI. I had to reassure them that Gwern really is not a crazy person and had no intention of sabotaging Intel worldwide, but was just exploring the consequences of local ideas. (I'm not sure this succeeded in reassuring them.)
But, y'know, if you don't want people to worry you might go crazy-nerd dangerous, then not writing up plans for ideology-motivated terrorist assaults on the semiconductor industry strikes me as a good start.
Edit: Technically just sabotage, not "terrorism" per se. Not that that would assuage qualms non-negligibly.
On your last point, I have to cite our all-*cough*-wise Professor Quirrell
Yeah, but he didn't do it right there in that essay. And saying "AI is dangerous, stopping Moore's Law might help, here's how fragile semiconductor manufacture is, just saying" still read to someone (including several commenters on the post itself) as bloody obviously implying terrorism.
You're pointing out it doesn't technically say that, but multiple people coming to that essay have taken it that way. You can say "ha! They're wrong", but I nevertheless submit that if PR is a consideration, the damage from the essay strikes me as unlikely to be outweighed by using rot13 for SEO.
"Just saying" is really not a disclaimer at all. c.f. publishing lists of abortion doctors and saying you didn't intend lunatics to kill them - if you say "we were just saying", the courts say "no you really weren't."
We don't have a demonstrated lunatic hazard on LW (though we have had unstable people severely traumatised by discussions and their implications, e.g. Roko's Forbidden Thread), but "just saying" in this manner still brings past dangerous behaviour along these lines to mind; and, given that decompartmentalising toxic waste is a known nerd hazard, this may not even be an unreasonable worry.
It's a name for the style of argument: that it's not advocating people do these things, it's just saying that uFAI is a problem, slowing Moore's Law might help and by the way here's the vulnerabilities of Intel's setup. Reasonable people assume that 2 and 2 can in fact be added to make 4, even if 4 is not mentioned in the original. This is a really simple and obvious point.
Note that I am not intending to claim that the implication was Gwern's original intention (as I note way up there, I don't think it is); I'm saying it's a property of the text as rendered. And that me saying it's a property of the text is supported by multiple people adding 2 and 2 for this result, even if arguably they're adding 2 and 2 and getting 666.
There's a pattern that shows up in some ethics discussions where it is argued that an action that you could actually go out and start doing (so no 3^^^3 dust specks or pushing fat people in front of runaway trains) that diverges from everyday social conventions is a good idea. I get the sense from some people that they feel obliged to either dismiss the idea by any means, or start doing the inconvenient but convincingly argued thing right away. And they seem to consider dismissing the idea with bad argumentation a lesser sin than conceding a point or suspending judgment and then continuing to not practice whatever the argument suggested. This shows up often in discussions of vegetarianism.
I got the idea that XiXiDu was going crazy because he didn't see any options beyond dedicating his life to door-to-door singularity advocacy or finding the fatal flaw which proved once and for all that SI are a bunch of deluded charlatans, and he didn't want to do the former just because a philosophical argument told him to and couldn't quite manage the latter.
If this is an actual thing, people with this behavior pattern would probably freak out if presented with an argument for terrorism they weren't able to dismiss as obviously flawed extremely quickly.
Thanks for comments. What I wrote was exaggerated, written under strong emotions, when I realized that the whole phyg discussion does not make sense, because there is no real harm, only some people made nervous by some pattern matching. So I tried to list the patterns which match... and then those which don't.
My assumption is that there are three factors which together make the bad impression; separately they are less harmful. Being only "weird" is pretty normal. Being "weird + thorough", for example memorizing all Star Trek episodes, is more disturbing, but it only seems to harm the given individual. Majority will make fun of such individuals, they are seen as at the bottom of pecking order, and they kind of accept it.
The third factor is when someone refuses to accept the position at the bottom. It is the difference between saying "yeah, we read sci-fi about parallel universes, and we know it's not real, ha-ha silly us" and saying "actually, our interpretation of quantum physics is right, and you are wrong, that's the fact, no excuses". This is the part that makes people angry. You are allowed to take the position of authority only if you are... (read more)
That nobody sees self-help books as weird or cultlike.
I think your post is troubling in a couple of ways.
First, I think you draw too much of a dichotomy between "read sequences" and "not read sequences". I have no idea what the true percentage of active LW members is, but I suspect a number of people, particularly new members, are in the process of reading the sequences, like I am. And that's a pretty large task - especially if you're in school, trying to work a demanding job, etc. I don't wish to speak for you, since you're not clear on the matter, but are people in the process of reading the sequences noise? I'm only in QM, and certainly wasn't there when I started posting, but I've gotten over 1000 karma (all of it on comments or discussion level posts). I'd like to think I've added something to the community.
Secondly, I feel like entrance barriers are pretty damn high already. I touched on this in my other comment, but I didn't want to make all of these points in that thread, since they were off topic to the original. When I was a lurker, the biggest barrier to me saying hi was a tremendous fear of being downvoted. (A re-reading of this thread seems prudent in light of this discussion) I'd never been part of a... (read more)
It was this thread.
Basically it boiled down to this: I was suggesting that one reason some people might donate to more than one charity is that they're risk averse and want to make sure they're doing some good, instead of trying to help and unluckily choosing an unpredictably bad charity. It was admittedly a pretty pedantic point, but someone apparently didn't like it.
My $0.02 (apologies if it's already been said; I haven't read all the comments): wanting to do Internet-based outreach and get new people participating is kind of at odds with wanting to create a specialized advanced-topics forum where we're not constantly rehashing introductory topics. They're both fine goals, but trying to do both at once doesn't work well.
LW as it is currently set up seems better optimized for outreach than for being an advanced-topics forum. At the same time, LW doesn't want to devolve to the least common denominator of the Internet. This creates tension. I'm about .6 confident that tension is intentional.
Of course, nothing stops any of us from creating invitation-only fora to which only the folks whose contributions we enjoy are invited. To be honest, I've always assumed that there exist a variety of more LW-spinoff private forums where the folks who have more specialized/advanced groundings get to interact without being bothered by the rest of us.
Somewhat relatedly, one feature I miss from the bad old usenet days is kill files. I suspect that I would value LW more if I had the ability to conceal-by-default comments by certain users here. Concealing sufficiently downvoted comments is similar in principle, but not reliable in practice.
My LessWrong Power Reader has a feature that allows you to mark authors as liked/disliked, which helps to determine which comments are expanded vs collapsed. Right now the weights are set so that if you've disliked an author, then any comment written by him or her that has 0 points or less, along with any descendants of that comment, will be collapsed by default. Each comment in the collapsed thread still has a visible header with author and points and color-coding to help you determine whether you still want to check it out.
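The collapse rule described above is simple enough to sketch in a few lines. Here is a minimal, hypothetical Python version; the `Comment` model and the function name are my own invention for illustration, not the actual Power Reader internals:

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    points: int
    children: list = field(default_factory=list)

def collapse_flags(comment, disliked_authors, parent_collapsed=False):
    """Decide which comments to collapse by default.

    A comment is collapsed if it was written by a disliked author and
    has 0 points or less; the collapse then propagates to all of its
    descendants, matching the rule described above.
    """
    collapsed = parent_collapsed or (
        comment.author in disliked_authors and comment.points <= 0
    )
    flags = {id(comment): collapsed}
    for child in comment.children:
        flags.update(collapse_flags(child, disliked_authors, collapsed))
    return flags
```

A reader UI would still render each comment's header (author, points, color-coding) and hide only the body wherever the flag is set, so you can still decide to expand it.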
There's probably no need for the groups to signal each other's existence.
When a new Secret Even Less Wrong is formed, members are previously formed Secret Even Less Wrongs who are still participating in Less Wrong are likely to receive secret invites to the new Secret Even Less Wrong.
Nyan_sandwich might set up his secret Google Group or whatever, invite the people he feels are worthy and willing to form the core of his own Secret Even Less Wrong, and receive in reply an invite to an existing Secret Even Less Wrong.
That might have already happened!
Let's be explicit here - your suggestion is that people like me should not be here. I'm a lawyer, and my mathematics education ended at Intro to Statistics and Advanced Theoretical Calculus. I'm interested in the cognitive bias and empiricism stuff (raising the sanity line), not AI. I've read most of the core posts of LW, but haven't gone through most of the sequences in any rigorous way (i.e. read them in order).
I agree that there seem to be a number of low quality posts in discussion recently (in particular, Rationally Irrational should not be in Main). But people willing to ignore the local social norms will ignore them however we choose to enforce them. By contrast, I've had several ideas for posts (in Discussion) that I haven't posted because I don't think they meet the community's expected quality standard.
Raising the standard for membership in the community will exclude me or people like me. That will improve the quality of technical discussion, at the cost of the "raising the sanity line" mission. That's not what I want.
No martyrs allowed.
I don't propose simply disallowing people who haven't read everything from being taken seriously, as long as they don't say anything stupid. It's fine if you haven't read the sequences and don't care about AI or heavy philosophy stuff; I just don't want to read dumb posts about those topics from someone who hasn't read the material.
As a matter of fact, I was careful to not propose much of anything. Don't confuse "here's a problem that I would like solved" with "I endorse this stupid solution that you don't like".
I, for one, would like to see discussion of LW topics from the perspective of someone knowledgeable about the history of law; after all law is humanity's main attempt to formalize morality, so I would expect some overlap with FAI.
I don't mind people who haven't read the sequences, as long as they don't start spouting garbage that's already been discussed to death and act all huffy when we tell them so; common failure modes are "Here's an obvious solution to the whole FAI problem!", "Morality all boils down to X", and "You people are a cult, you need to listen to a brave outsider who's willing to go against the herd like me".
I think the barrier of entry is high enough - the signal-to-noise ratio is high, and if you only read high-karma posts and comments you are guaranteed to get substance.
As for forcing people to read the entire Sequences, I'd say rationalwiki's critique is very appropriate (below). I myself have only read ~20% of the Sequences, and by focusing on the core sequences and highlighted articles, have recognized all the ideas/techniques people refer to in the main-page and discussion posts.
You should try reading the other 80% of the sequences.
As far as I can tell (low votes, some in the negative, few comments), the QM sequence is the least read of the sequences, and yet makes a lot of EY's key points used later on identity and decision theory. So most LW readers seem not to have read it.
Suggestion: a straw poll on who's read which sequences.
I've seen enough of the QM sequence and know enough QM to see that Eliezer stopped learning quantum mechanics before getting to density matrices. As a result, the conclusions he draws from QM rely on metaphysical assumptions and seem rather arbitrary if one knows more quantum mechanics. In the comments to this post Scott Aaronson tries to explain this to Eliezer without much success.
The negative comments from physicists and physics students are sort of a worry (to me as someone who got up to the start of studying this stuff in second-year engineering physics and can't remember one dot of it). Perhaps it could do with a robustified rewrite, if anyone sufficiently knowledgeable can be bothered.
The Quantum Physics Sequence is unusual in that normally, if someone writes 100,000(?) words explaining quantum mechanics for a general audience, they genuinely know the subject first: they have a physics degree, they have had an independent reason to perform a few quantum-mechanical calculations, something like that. It seems to me that Eliezer first got his ideas about quantum mechanics from Penrose's Emperor's New Mind, and then amended his views by adopting many-worlds, which was probably favored among people on the Extropians mailing list in the late 1990s. This would have been supplemented by some incidental study of textbooks, Feynman lectures, expository web pages... but nonetheless, that appears to be the extent of it. The progression from Penrose to Everett would explain why he presents the main interpretive choice as between wavefunction realism with objective collapse, and wavefunction realism with no collapse. His prose is qualitative just about everywhere, indicating that he has studied quantum mechanics just enough to satisfy himself that he has obtained a conceptual understanding, but not to the point of quantitative competence. And then he has undertaken to convey ... (read more)
Excellent idea - done. Thank you!
Yes! I try to get people to read the "sequences" in ebook form, where they are presented in simple chronological order. And the title is "Eliezer Yudkowsky, blog posts 2006-2010".
Working on it.
In all seriousness though, I often find the Sequences pretty cumbersome and roundabout. Eliezer assumes a pretty large inferential gap for each new concept, and a lot of the time the main point of an article would only need a sentence or two for it to click for me. Obviously this makes it more accessible for concepts that people are unfamiliar with, but right now it's a turn-off and definitely is a body of work that will be greatly helped by being compressed into a book.
Tetronian started the article, so it's his fault actually, even if he's pretty much moved here.
I have noted before that taking something seriously because it pays attention to you is not in fact a good idea. Every second that LW pays a blind bit of notice to RW is a second wasted.
See also this comment on the effects of lack of outside world feedback, and a comparison to Wikipedia (which basically didn't get any outside attention for four or five years and is now part of the infrastructure of society, at which I still boggle).
And LW may or may not be pleased that even on RW, when someone fails logic really badly the response is often couched in LW terms. So, memetic infections ahoy! Think of RW as part of the Unpleasable Fanbase.
I disagree with the grandparent. I read the book a while ago having already read most of the Sequences -- I think that the book gives a fairly good overview of heuristics and biases but doesn't do as good of a job in turning the information into helpful intuitions. I think that the Sequences cover most (but not quite all) of what's covered in the book, while the reverse is not true.
Lukeprog reviewed the book here: his estimate is that it contains about 30% of the Core Sequences.
From Shirky's essay on online groups: "The Wikipedia right now, the group-collaborated online encyclopedia, is the most interesting conversational artifact I know of, where product is a result of process. Rather than 'We're specifically going to get together and create this presentation' it's just 'What's left is a record of what we said.'"
When somebody goes to a wiki, they are not going there to discuss elementary questions that have already been answered; they are going there to read the results of that discussion. Isn't this basically what the OP wants?
Why aren't we using the wiki more? We have two modes of discussion here: discussion board, and wiki. The wiki serves more as an archive of the posts that make it to main-page level, meaning that all the hard work of the commenters in the discussion boards is often lost to the winds of time. (Yes, some people have exceptionally good memory and link back to them. But this is obviously not sustainable.) If somebody has a visionary idea on how to lubricate the process of collating high-quality comments and incorporating them into a wiki-like entity, then I suspect our problem could be solved.
This is a really good question.
I don't use the wiki because my LW account is not valid there. You need to make a separate account for the wiki.
That seems like an utterly stupid reason in retrospect, but I imagine that's a big reason why no one is wikiing.
So it's a trivial inconvenience?
The best way to become more exclusive without giving the impression of a cult or resorting to banning people is to raise your standards and become more technical. As exemplified by the math communities like the n-Category Café, various computer science blogs, or, most of all, the technical posts of lesswrong.
Stop using that word.
I want to keep the use of the word, but to hide it from google I have replaced it with its rot13: phyg
And now we can all relax and have a truly uninhibited debate about whether LW is a phyg. Who would have guessed that rot13 has SEO applications?
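(Aside, for anyone unfamiliar with the scheme: rot13 just shifts each letter 13 places, so encoding and decoding are the same operation. A minimal sketch using Python's built-in codec, nothing from the thread itself:)

```python
import codecs

# rot13 maps each letter 13 places along the 26-letter alphabet, so
# applying it twice returns the original text: encode == decode.
print(codecs.encode("cult", "rot13"))  # phyg
print(codecs.encode("phyg", "rot13"))  # cult
```

Since 13 is half of 26, the cipher is its own inverse, which is why one function suffices for both directions.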
Well, yes and no.
I feel pain just looking at that sentence.
I sure as hell hope self-censorship or encryption for the sake of google results isn't going to become the expected norm here. It's embarrassingly silly, and, paradoxically, likely to provide ammunition for anyone who might want to say that we are this thing-that-apparently-must-not-be-named. Wouldn't be overly surprised if these guys ended up mocking it.
The original title of the post had a nice impact; the point of the rhetorical technique used was to boldly use a word with negative connotations. Now it looks weird and anything but bold.
Also, reading the same rot13-ed word multiple times caused me to learn a small portion of rot13 despite my not wanting to. Annoying.
The only word that shouldn't be used for reasons that extend to not even identifying it. (google makes no use/mention distinction).
Reading the comments, it feels like the biggest concern is not chasing away the initiates to our phyg. Perhaps tiered sections, where demonstrable knowledge in the last section gains you access to higher levels of signal to noise ratio? Certainly would make our phyg resemble another well known phyg.
Maybe we should charge thousands of dollars for access to the sequences as well? And hire some lawyers...
More seriously, I wonder what people's reaction would be to a newbie section that wouldn't be as harsh as the now-much-harsher normal discussion. This seems to go over well on the rest of the internet.
Sort of like raising the price and then having a sale...
What you want is an exclusive club. Not a cult or phyg or whatever.
I personally come to Less Wrong specifically for the debates (well, that, and HP:MoR Wild Mass Guessing). Therefore, raising the barrier to entry would be exactly the opposite of what I want, since it would eliminate many fresh voices, and limit the conversation to those who'd already read all of the sequences (a category that would exclude myself, now that I think about it), and agree with everything said therein. You can quibble about whether such a community would constitute a "phyg" or not, but it definitely wouldn't be a place where any prod...
I don't see why having the debate at a higher level of knowledge would be a bad thing. Just because everyone is familiar with a large body of useful common knowledge doesn't mean no one disagrees with it, or that there is nothing left to talk about. There are some LW people who have read everything and bring up interesting critiques.
Imagine watching a debate between some uneducated folks about whether a tree falling in a forest makes a sound or not. Not very interesting. Having read the sequences, it's the same sort of boring as someone explaining for the millionth time that "no, technological progress or happiness is not a sufficient goal to produce a valuable future, and yes, an AI coded with that goal would kill us all, and it would suck".
The point of my post was that that is not an acceptable solution.
A 'debate club' mindset is one of the things I would try to avoid. Debates emerge when there are new ideas to be expressed and new outlooks or bodies of knowledge to consider - and the supply of such is practically endless. You don't go around trying to artificially encourage an environment of ignorance just so some people are sufficiently uninformed that they will try to argue trivial matters. That's both counterproductive and distasteful.
I would not be at all disappointed if a side effect of maintaining high standards of communication causes us to lose some participants who "come to Less Wrong specifically for the debates". Frankly, that would be among the best things we could hope for. That sort of mindset is outright toxic to conversations and often similarly deleterious to the social atmosphere.
I agree pretty much completely and I think if you're interested in Less Wrong-style rationality, you should either read and understand the sequences (yes, all of them), or go somewhere else. Edit, after many replies: This claim is too strong. I should have said instead that people should at least be making an effort to read and understand the sequences if they wish to comment here, not that everyone should read the whole volume before making a single comment.
There are those who think rationality needs to be learned through osmosis or whatever. That...
This is a pretty hardcore assertion.
I am thinking of lukeprog's and Yvain's stuff as counterexamples.
I think of them (and certain others) as exceptions that prove the rule. If you take away the foundation of the sequences and the small number of awesome people (most of whom, mind you, came here because of Eliezer's sequences), you end up with a place that's indistinguishable from the programmer/atheist/transhumanist/etc. crowd, which is bad if LW is supposed to be making more than nominal progress over time.
Standard disclaimer edit because I have to: The exceptions don't prove the rule in the sense of providing evidence for the rule (indeed, they are technically evidence contrariwise), but they do allow you to notice it. This is what the phrase really means.
Um, after I read the sequences I ploughed through every LW post from the start of LW to late 2010 (when I started reading regularly). What I saw was that the sequences were revered, but most of the new and interesting stuff from that intervening couple of years was ignored. (Though it's probably just me.)
At this point A Group Is Its Own Worst Enemy is apposite. Note the description of the fundamentalist smackdown as a stage communities go through. Note it also usually fails when it turns out the oldtimers have differing and incompatible ideas on what the implicit constitution actually was in the good old days.
tl;dr declarations of fundamentalism heuristically strike me as inherently problematic.
edit: So what about this comment rated a downvote?
edit 2: ah - the link to the Shirky essay appears to be giving the essay in the UK, but Viagra spam in the US o_0 I've put a copy up here.
Exchanges the look two people give each other when they each hope that the other will do something that they both want done but which neither of them wants to do.
Evaporative cooling is change to average belief from old members leaving.
Your article is about change to average belief from new members joining.
I don't consider myself a particularly patient person when it comes to tolerating ignorance or stupidity, but even so, I don't much mind if people here contribute without having done much background reading. What matters is that they don't behave like an obnoxious prat about it and are interested in learning things.
I do support enforcing high standards of discussion. People who come here straight from their highschool debate club and Introduction to Philosophy 101 and start throwing around sub-lesswrong-standard rhetoric should be downvoted. Likewise for confident declarations of trivially false things. There should be more correction of errors that would probably be accepted (or even rewarded) in many other contexts. These are the kind of thing that don't actively exclude but do have the side effect of raising the barrier to entry. A necessary sacrifice.
Isn't the metaethics sequence not liked very much? I haven't read it in a while, and so I'm not sure that I actually read all of the posts, but I found what I read fairly squishy, and not even on the level of, say, Nietzsche's moral thought.
Downvoting people for not understanding that beliefs constrain expectation I'm okay with. Downvoting people for not agreeing with EY's moral intuitions seems... mistaken.
Beliefs are only sometimes about anticipation. LessWrong repeatedly makes huge errors when it interprets "belief" in such a naive fashion; giving LessWrong a semi-Bayesian justification for this collective failure of hermeneutics is unwise. Maybe beliefs "should" be about anticipation, but LessWrong, like everybody else, can't reliably separate descriptive and normative claims, which is exactly why this "beliefs constrain anticipation" thing is misleading. ...There's a neat level-crossing thingy in there.
EY thinking of meta-ethics as a "solved problem" is one of the most obvious signs that he's very spotty when it comes to philosophy and can't really be trusted to do AI theory.
(Apologies if I come across as curmudgeonly.)
Part of my concern about Eliezer trying to build FAI also stems from his treatment of metaethics. Here's a caricature of how his solution looks to me:
Alice: Hey, what is the value of X?
Bob: Hmm, I don't know. Actually I'm not even sure what it means to answer that question. What's the definition of X?
Alice: I don't know how to define it either.
Bob: Ok... I don't know how to answer your question, but what if we simulate a bunch of really smart people and ask them what the value of X is?
Alice: Great idea! But what about the definition of X? I feel like we ought to be able to at least answer that now...
Bob: Oh that's easy. Let's just define it as the output of that computation I just mentioned.
BTW, I've had numerous "wow" moments with philosophical insights, some of which made me spend years considering their implications. For example:
I expect that a correct solution to metaethics would produce a similar "wow" reaction. That is, it would be obvious in retrospect, but in an overwhelming instead of underwhelming way.
Hm. I think I'll put on my project list "reread the metaethics sequence and create an intelligent reply." If that happens, it'll be at least two months out.
Has it ever been demonstrated that there is a consensus on what point he was trying to make, and that he in fact demonstrated it?
He seems to make a conclusion, but I don't believe demonstrated it, and I never got the sense that he carried the day in the peanut gallery.
Well, for starters determining whether something is a preference or a bias is rather arbitrary in practice.
Expecting your interlocutors to have a passing familiarity with the subject under discussion is not a logical fallacy.
I think you want it more tiered/topic'ed, not more exclusive, which I would certainly support. Unfortunately, the site design is not a priority.
I've seen this suggested before, and while it would have positive aspects, from a PR perspective, it would be an utter nightmare. I've been here for slightly less than a year, after being referred to HPMOR. I am very unlikely (Prior p = 0.02, given that EY started it and I was obsessed with HPMOR, probably closer to p = 0.07) to have ever followed a forum/blog that had an "exclusive members" section. Insomuch as LW is interested in recruiting potential rationalists, this is a horrible, horrible idea.
I know you say that you don't want to end up with "ignore any discussion that contains contradictions of the lesswrong scriptures", but it sounds a bit like that. (In particular, referring to stuff like "properly sequenced LWers" suggests to me that you not only think the sequences are interesting, but that they are actually right about everything.) The sequences are not scripture, and I think (hope!) there are a lot of LWers who disagree to a greater or lesser degree with them.
For example, I think the metaethics sequence is pretty hopeless (WA...
What if users were expected to have a passing familiarity with the topics the sequences covered, but not necessarily to have read them? That way, if they were going to post about one of the topics covered in the sequences, they could be sure to brush up on the state of the debate first.
I would guess that hanging out with friends who are aspiring rationalists is a faster way to become rational than reading the sequences.
In any case, it seems pretty clear to me that the sequences do not have a monopoly on rationality. Eliezer isn't the only person in the world who's good at thinking about his thinking.
FWIW, I was thinking along the lines of only requesting passing familiarity with non-core sequences.
I haven't read most of the sequences yet, and I agree with most of what is said by those LW members you'd like to see more of.
Most of the criticisms I voice are actually rephrased and forwarded arguments and ideas from people much smarter and more impressive than me. Including big names like Doug...
Um, you seem to me to be saying that someone (davidad) who is in fact familiar with the sequences, and who left AI to achieve things well past most of LW's participants, is a perfect example of who you don't want here. Is that really what you meant to put across?
When EY was writing the sequences, what percentage of the population was he hoping to influence? I suppose a lot. Now some people are bothered because the message began to spread, and in the meantime the quality of posts is not the same. Well, if the discussion becomes poor, go somewhere else. Highly technical people simply don't get involved in something they see as hopeless or uninteresting, like trying to make people more rational or reduce x-risks.
First they came for the professional philosophers,
and I didn't speak out because I wasn't a professional philosopher.
Then they came for the frequentists,
and I didn't speak out because I wasn't a frequentist.
Then they came for the AI skeptics,
and I didn't speak out because I wasn't skeptical of AI.
and then there was no one left to talk to.
"These guys are cultish and they know it, as evidenced by the fact that they're censoring the word 'cult' on their site"