There seem to be some folks who might derive useful insights from a third-party, mostly neutral perspective on how the community appears after an honest and sustained effort at engagement: someone who doesn't place AI risk as their top priority, but who also doesn't completely ignore it, as some critics or opponents of LW might.

Notably, I've encountered some folks who had strong personal opinions one way or the other but refrained from expressing them publicly, or even pseudonymously.

There also appears to be a large group of lurkers, or once-in-a-blue-moon posters, who nonetheless hold some views of the community and might benefit from someone willing to take the risk of doing a write-up.

 

First off, addressing the popular critiques and praise:

There has definitely been some evaporative cooling of the community in the past decade or so. Some of the most insightful members have gone on to bigger things, and the average quality of new posts is somewhat lower than it was a decade ago. Or so it appears, as far as I can tell from the archives. This isn't very surprising, as it is the common trajectory of every community that rapidly grows in size.

It would have taken a superhuman effort to retain the same level of quality going from 100 to 1,000 users, let alone from 1,000 to 10,000, and so on. So I don't think that would have been a fair expectation to place on the moderators of a decade ago, or anyone else involved.

On the flip side, there is a larger cross-section of society represented in the 2022 userbase, and there has been a corresponding softening of the hard edges that may have been off-putting to some a decade ago.

Relatedly, the proportion of really bizarre or challenging writing has gone down, for better and for worse. There does appear to be some unique benefit to a community with lots of oddballs with jarring writing styles, but the downside is obvious: nobody really wants to have their norms challenged in every paragraph.

 

There has been a growing focus, and emphasis, on AI risk and alignment, so LW does seem to be less of a catch-all forum than before. This is clearly better for those who wish to focus and really get into the details. But I can sympathize with the critique that there's less charm compared to a group of unconstrained folks exploring in a hundred different directions.

Some of the common terminology is indeed puzzlingly unique, and thus promotes a distinctive writing style that can, in extreme cases, read like satire of the academic norm. I found it personally difficult to adjust to, and as you can tell from my somewhat varying writing style, I haven't found the best way to write in a conversational yet concise manner while incorporating the terminology.

There is some charm, and exciting challenge, in trying to craft writing that isn't dry and aloof yet remains accurate enough to describe highly complex and technical ideas. Many of the critics seem to be missing the forest for the trees.

And I can see very little direct harm in being an outlier in writing style, and quite a lot of benefit from having such a unique differentiator.

Certainly, some of the best fanfiction I've ever read came from community members, and I highly doubt it would be nearly as popular if written in standard academic jargon.

 

That being said, the critics do have a reasonable argument when it comes to comparisons with other notable online forums. LW does seem slightly more insular in some respects than SSC, Overcoming Bias, HN, etc., though that is understandable given the unique origins of the community and the developments that have taken place since.

Also, I do agree that there is a slight 'creepiness vibe' to the way the Sequences are presented by default on the home page, as well as to the way they are written. Of course, the necessity of presenting very abstract concepts in an educational manner, to a target demographic that is less experienced and outside the 99.9th percentile that typically interacts with such ideas in elite graduate programs and comparable institutions, really does constrain the possible range of writing styles.

For example, references to popular culture and mass media that would seem too immature for a serious academic journal are almost to be expected in casual online essays, and likewise the more emotive and conversational writing style. Given also the very understandable desire not to alienate potential peripheral demographics, I can't see a much better way the Sequences could have been written, even with a decade of hindsight, without drastically narrowing the potential audience.

 

There is also the commonly made criticism of 'hero worship' in regard to Eliezer, which I've found does exist but is somewhat overstated. On average he does receive more deferential treatment than other long-standing community members, and it does appear to be more than a strictly linear increase, as his on-paper accomplishments seem somewhat less than those of, for example, Robin Hanson.

I've not met either personally, so I will refrain from judging their personal qualities.

But I can also empathize with many of the newer, and younger, members who likely have never had one-on-one interactions with someone relatively well known, in person or online, and who understandably lack the context to place themselves.

I have been quite fortunate to have met people who were truly far superior to myself in some capacity, and who seem to be several levels above anyone on LW, so I know it's not realistic to expect everyone to have had the same opportunity.

The information environment is also more adversarial nowadays than it was before, so the apparent deficiencies of LW need to be adjusted to the new environmental baseline to allow for a fair comparison. None of the critics, or opponents, seems to have done this, though this is almost never done for any type of criticism. Perhaps that says more about criticism, and critics themselves, than about what they critique.

So it may very well be that, relative to its environment, 2022 LW is actually superior to 2012 LW. Especially as the moderators have done a good job of maintaining the publicly accessible focus.

 

Overall, my experience has been positive: there are still interesting writings, the work being done clearly has some non-zero value, and the community norms are certainly far better than most of the internet's.

In practical terms, I've found the average quality of writing on LW to match that of the best subreddits, but to fall short of the best private forums.

(For reference I keep a tally of essays and although none of the writings here make it into my top 10 list of the most persuasive essays I've ever read, there is a startlingly high proportion in my top 1000.)

Whether average quality can be maintained will likely be a function of growth rate, overall size, and moderator involvement. Here it would be customary to offer some advice for the future, but there are already many competent folks who have some interest in maintaining community quality, and who have enough experience to understand the common pitfalls of online interaction.

In the end I do hope the community will keep on going, in some form, into the distant future.

40 comments

Given that LW was almost dead a few years back, and is now, against the odds, a vibrant community that hasn't lost its uniqueness, with Eliezer and sometimes Scott Alexander posting again, kudos go to the LW 2.0 team (and to the GW mirror).

Interestingly, I didn't perceive such a period when going through the archives. Perhaps there were still enough memorable writings to obscure any appearance of dormancy. That may also say something about how much of what we read is truly remembered with clarity.

LW does seem slightly more insular in some respects than SSC, Overcoming Bias, HN, etc., though that is understandable given the unique origins of the community and the developments that have taken place since.

I think other groups (e.g., EAs / the EA Forum) should try to emulate LW and become more insular, in at least a few respects:

  • Be more willing to develop your own culture, and enforce your own norms, that are very different from the culture and norms of society-at-large. Do more things based on what makes sense, rather than what's conventional.
  • Care less about your reputation in the eyes of people outside the community. (E.g., I think a lot of what's good about LW comes from the fact that we're more trying to impress each other, and being at least somewhat deliberate about which things we want to count as 'impressive', rather than parroting whatever's conventional in some non-LW group that we want to win over.)
  • Be more willing to disagree with a mainstream view (once you think you understand the view and have a counter-argument you think is strong), and to build edifices of knowledge that are predicated on 'mainstream view X is false'.
    • This doesn't mean 'ignore all arguments from people outside the community'; but it means something like 'use those arguments more like a convenient tool for actually learning things, testing things, and narrowing your uncertainty', which is a different mindset than 'listen to outsiders as a sign of respect / as a way to signal virtuous humility / as a way to signal virtuous deference / etc.'.

Thanks for raising some valuable points. I agree there is a strong argument to be made for a certain degree of insularity of norms and culture, as it seems highly unlikely that human flourishing would be optimally sustained if every group on Earth converged on the exact same set.

Some of the apparent deficiencies of LW may turn out to be strengths when viewed from such a perspective. Though that raises the interesting question of whether some apparent strengths may turn into deficiencies...

Though that raises the interesting question of whether some apparent strengths may turn into deficiencies...

I, for one, am interested in how the rest of this sentence goes.

For example, having a broader cross section of society in the community may be detrimental to the goal of maintaining that sufficient degree of insularity. Whereas the slight 'creepiness vibe' may in fact be beneficial for achieving it.

For me, as someone who's lurked off and on for a few years and only started regularly commenting recently, I find this whole place terribly intimidating. Everyone else is far smarter than me and I am used to being the smartest person in every room, and it's quite painful and makes it hard to interact. It's some of the best writing I've ever read on the internet... but that may be a bad thing, as it's an impossible threshold to climb over as a newbie.

You might try using the shortform feature to get some ideas out in a less intense setting. 

I used to be the smartest person in the room and it felt lonely. Also, it felt like some problems are never going to be fixed, because I am unable to fix them, and other people are often unable to even see them.

The content on LW is not graded on a curve. There is some noise in the karma feedback, but generally, as long as your writing avoids certain mistakes, the response is neutral or positive.

I'm usually the dumbest person in the room and I still post occasionally.

Hmm. Well, this could imply you're smarter than me, as a truly smart person would surround themself with other people smarter still :P

In addition to Raemon's suggestion, I would add that keeping a daily personal diary, if you haven't already, is vital to the developmental process of understanding how best to organize your own thoughts. Or so I've observed.

Been doing that for years. My thoughts are still ill-organized. It's something about how my brain works. Attention deficit plus a dearth of memory = I don't have a bird's eye view of my own mind, and I rely on intuition (trained on details I can no longer remember) to tell me what to think, after which I have to try to figure out why I feel that xyz thing is true (that is, what experiences in the past trained that intuition) before I can determine whether to trust it or not. Almost none of my cognition is rooted in any kind of analytical reasoning. I just have to make it look like it is in order to communicate with others.

If you constantly re-read past entries and try to improve on clarity, conciseness, and perceptive depth with every new entry, I think you will eventually improve your writing skills. It just takes effort and persistence, even on the days when self-reflection appears unbearably painful.

Curious what essays make your top 10 list?

Here's a few, unordered:

As We May Think, by Vannevar Bush

Politics and the English Language, by George Orwell

The Tyranny of Structurelessness, by Jo Freeman

Some Moral and Technical Consequences of Automation, by Norbert Wiener

Can We Survive Technology?, by John von Neumann (though I may be a bit biased here, as I've had personal interaction with one of his family members)

Oh, that makes sense. Yepp, if you're talking about essays from throughout history then breaking the top 10 does seem like a high bar.

Though I think, for me, that the following probably make it in (especially when I weight more heavily on usefulness to me, rather than prescience):

https://www.lesswrong.com/posts/yA4gF5KrboK2m2Xu7/how-an-algorithm-feels-from-inside
https://www.lesswrong.com/s/SGB7Y5WERh4skwtnb/p/FaJaCgqBKphrDzDSj
https://slatestarcodex.com/2018/01/24/conflict-vs-mistake/
https://slatestarcodex.com/2014/12/17/the-toxoplasma-of-rage/
https://www.overcomingbias.com/2009/09/this-is-the-dream-time.html

‘This is the dream time’ is definitely one of the best modern essays. Interestingly, the two SSC essays you picked I found to be quite average for Scott Alexander; I thought some of his book reviews were extraordinary, like ‘Albion’s Seed’.

You mention that the essays here aren't in your top most persuasive. I will just note that an explicit goal of this forum is that we are asked to write to inform, not to persuade. I think that's really critical in maintaining a rationalist and a pleasant environment.

Just a random note. Thanks for the comparative analysis.

Contra "inform, not persuade", I remember reading Luke Muehlhauser's old post Rhetoric for the Good:

The topics of rationality and existential risk reduction need their own Richard Dawkins. Their own Darwin. Their own Voltaire.

Rhetoric moves minds.

Students and masochists aside, people read only what is exciting. So: Want to make an impact? Be exciting. You must be heard before you can turn heads in the right direction.

My sense is that Eliezer also consciously wrote persuasively; as a young LW lurker a decade ago, it was that persuasiveness that kept me coming back.

I'm hence somewhat surprised to see "an explicit goal of this forum is that we are asked to write to inform, not to persuade" quite highly upvoted and agreed with. I wonder what changed, or whether my initial perception was just wrong to begin with.

I think you're talking about outward-facing writing. I mean stuff meant to recruit new rationalists, not stuff directed at rationalists.

Also, there's no conflict between being exciting and writing to inform.

You mention that the essays here aren't in your top most persuasive. I will just note that an explicit goal of this forum is that we are asked to write to inform, not to persuade. I think that's really critical in maintaining a rationalist and a pleasant environment.

Quite a few writers have not gotten the memo then, if that is the consensus.

I do agree that there is a slight ‘creepiness vibe’ with the way the Sequences . . . are written.

IIRC back when he was writing the sequences, Eliezer said that he was psychologically incapable of writing in the typical dry academic manner. I.e., he wouldn't be able to bring himself to do so even if he knew doing so would improve the reception of his writing.

Maybe he used the word "stuffy".

Pretty much the only way I can get myself to post here is to write a draft of the post I actually want to write, then just post that draft, since otherwise I’ll sit on it forever

I know it's a bit late to respond, but your comment had me thinking for a bit.

The typical academic writing style has certain specialized purposes, in the hands of experts, that Eliezer would almost certainly not have shared, or realized, when starting this forum. 

‘Stuffiness’ may even be a desired attribute. 

Though I do agree Eliezer seems to have an antipathy towards it and seemed to intentionally write in the opposite tone at the beginning. Although this was advantageous for attracting a wider audience, it had the disadvantage of adding some unfortunate, even suspicious, undertones for an experienced and careful reader.

Then again due to the constrained nature of online communication it’s easy to suspect an ulterior purpose behind anything unusual one encounters.

Thanks for the write-up.

A few lines I didn't quite get.

a distinctive writing style that does, in extreme cases, seem to be satire compared to the academic norm

I don't know what you're referring to here, an example would be helpful.

I do agree that there is a slight 'creepiness vibe' with the way the Sequences are presented by default on the home page

I don't know who you're agreeing with or why showing some of the best writing on the site seems creepy to you.

In practical terms I've found the average quality of writings on LW to match the best subreddits but worse than the best private forums.

I'd be somewhat more interested in you comparing the top 1-5% of the best subreddits and of LW? The average doesn't seem as interesting to me, because it falls off the frontpage much faster than the peak content.

Ideally I would link to concrete examples but I'm afraid it would come across as me calling out someone else, especially if they believe they put in their best effort in writing a serious essay, so I will have to leave it to your imagination.

The Sequences do contain some very good pieces of writing, but they also contain some that are not so good. Perhaps it is an artifact of the time period and/or Eliezer's personal idiosyncrasies, but I can't honestly say I perceived all of it as entirely wholesome. For example, some of it comes across as more argumentative than necessary, some of it seems a bit too eager for recognition, and so on. Due to the nature of vibes, I'm not sure I could provide a more convincing explanation. Then again, I may just be an outlier.

The top 1% of writings on LW are definitely better than the top 1% of any subreddit I've seen. I think they are probably the largest collection of high quality writing by a pool of many dozens (hundreds?) I've seen anywhere online.

Ideally I would link to concrete examples but I'm afraid it would come across as me calling out someone else, especially if they believe they put in their best effort in writing a serious essay, so I will have to leave it to your imagination.

For the record, I think critiques can accurately describe someone's writing critically without being an unreasonable aggression, and I think critiques are much better for concreteness. I think your post would be 2-5x as valuable to me if I had concrete posts in mind for what you were pointing to, when you discuss old posts that were better than new posts, or posts that use jargon to an excessive degree compared to academia.

Perhaps, though I have yet to see any successful examples of such a comparison. And it may be a moral hazard regardless with deleterious second, and higher, order effects.

Hm, I think there are lots of examples. The first to come to mind is a recent reply to Eliezer by Holden, in which I think a severe criticism was respectfully delivered, like this:

Something like half of this post is blockquotes. I've often been surprised by the degree to which people (including people I respect a lot, such as Eliezer in this case) seem to mischaracterize specific pieces they critique, and I try to avoid this for myself by quoting extensively from a piece when critiquing it.

And lines like this:

"Most of Eliezer's critique seems directed at assumptions the report explicitly does not make about how transformative AI will be developed, and more broadly, about the connection between its (the report's) compute estimates and all-things-considered AI timelines."

This appears to be mostly, if not entirely, concerned with the substantive content of a post, not the style or manner of writing. Style criticisms are a lot trickier and I'm not sure if it's possible to avoid hurt feelings one way or another.

Fair point.

For example, some of it comes across as more argumentative than necessary, some of it seems a bit too eager for recognition, and so on. Due to the nature of vibes, I'm not sure if I could provide a more convincing explanation. Then again I may just be an outlier.

 

I remember having similar impressions when first encountering Eliezer's writing. So if you are an outlier, you're not the only one.

I think his writing is captivating, illustrative, and convincing-sounding, which ironically makes me more suspicious of it than if it were very dry and logical.

The information environment is also more adversarial nowadays than it was before, so the apparent deficiencies of LW need to be adjusted to the new environmental baseline to allow for a fair comparison. None of the critics, or opponents, seems to have done this, though this is almost never done for any type of criticism. Perhaps that says more about criticism, and critics themselves, than about what they critique.

So it may very well be that, relative to its environment, 2022 LW is actually superior to 2012 LW. Especially as the moderators have done a good job of maintaining the publicly accessible focus.

This is actually a very serious problem that will most likely get worse regardless of what galaxy-brained solutions are thought up and implemented. It's the sort of thing where, no matter how deep you burrow down, someone will always be able to dig out your hole until they ferret you out, because digging a hole leaves a trail by default (the tunnel). It really raises the question of whether accepting continued decline on lesswrong.com is worth it to avoid the risk of making the issue even worse.

But I'm happy to work on solutions nonetheless, and I'm sure a lot of others are too. It's just that it has to be via direct messages instead of public posts, and it really ought to involve in-person communication taking the place of online communication. And we have to know that others exist who also care, instead of being emotionally filled with vague doubt that anyone is out there and that anything is worth doing.

It does seem highly doubtful true harmony will be established on the Earth in our lifetime. So any feasible solutions may indeed have to undergo a long period of development in private.

Hello M.Y.Zuo,


[...evaporative cooling effect...] This isn't very surprising as this is the common trajectory of every community that rapidly grows in size.

It would have taken a super-human effort to retain the same level of quality going from 100 to 1000 users, let alone from 1000 to 10000, and so on. So I don't think that would have been a fair expectation to place on the moderators, or anyone else involved, of a decade ago.

I agree that it must be a massive undertaking to go from 100 to 10,000 users. However, I question why that should be a good thing in and of itself. Whether growth is good or not should depend on the goals and purpose of LW (I wrote more about this in my post). Otherwise it would be akin to growth for growth's sake, which sounds weird to me as a metric of success.

If quality is even a tiny bit diluted, isn't that a net loss, regardless of the quantifiable gain in members? 

What I wish for is a conscious and intentional decision regarding what kind of 'temperature' one wants LW to have, and then strive to achieve that temperature.

Consequently, to talk about the evaporative cooling effect not as some force of nature, but as the result of choices. The goal being not only to make the choices conscious, but also to make them coherent with a precise purpose and relevant sub-goals.

So far, I haven't seen any good answers to this, but I am new, so there might be a lot I haven't read. However, the FAQ for LW is way too vague regarding where the site is headed. With this kind of growth, it could head in many different directions. So, which is it going to be?

I would also like to see the community keep going, but I would also like to see the issue of specifying purpose, and aligning sub-goals with that purpose, taken seriously. If the parameters of the community change, it is no longer the same community, even when it bears the same name. And from the birth of LW till now, there seem to have been a lot of changes, some of which I assume meant that LW fundamentally changed identity.

And instead of that just happening, isn't it better to actually say: this is what we want to achieve, this is how we will do it and what we will select for when adding new members, and then work towards it? Then 'success' would be a measurable metric, instead of a vague and unknown one.


Kindly, but firmly,
Caerulea-Lawrence
 

What I wish for is a conscious and intentional decision regarding what kind of 'temperature' one wants LW to have, and then strive to achieve that temperature.
 

This may be possible via a private club type membership system, where there's a capped number of possible members, along with some kind of vetting process. Though I doubt the actual practical effects would be net positive, at least for a public forum.


This may be possible via a private club type membership system, where there's a capped number of possible members, along with some kind of vetting process. Though I doubt the actual practical effects would be net positive, at least for a public forum.


If 'public forum' is the right temperature for LW, that is not in opposition to my point. My argument is to make purpose, sub-goals, and potential 'vetting' streamlined, in order to clarify success metrics and have a uniform direction, instead of a lot of unwritten and unconscious ideas forming the future LW. For users to be part of, and directly contribute to, a clear goal, rather than just serve some vague idea of something, and for this goal to also serve the users joining it.

One issue with having a vague and broad goal is that the goal would fit a long list of different iterations. With a goal as broad as the current one, the iterations 'private club with membership' and 'anyone is welcome' aren't mutually exclusive. But which of the iterations is LW specifically encouraging and supporting? What is the identity of LW?


LessWrong is a community dedicated to improving our reasoning and decision-making. We seek to hold true beliefs and to be effective at accomplishing our goals. More generally, we want to develop and practice the art of human rationality.

To that end, LessWrong is a place to 1) develop and train rationality, and 2) apply one’s rationality to real-world problems.
 

The needs of beginners, amateurs, intermediates, experts, and masters are very specific when it comes to what constitutes a useful and nurturing environment to 'develop and train rationality'. There might be some slight overlaps, for various reasons, but to "be effective at accomplishing our goals", there must be specific goals that need accomplishing. And the environment that enables one group to thrive might be the opposite of what another needs; ignoring that is detrimental to both.

Meaning that LW might be 'the best place', but it isn't the best place because it gives everyone, at their different levels, the unique environment they need to thrive; it is only 'the best place' compared to what is available.
When people have left LW, I imagine they recognized this: that even though the site doesn't specify it explicitly, implicitly the needs of specific groups are ignored or not acknowledged. And when a site opens to a lot of new members, without specific selection, at some point there will be a significant shift in balance in one or two directions, even without any explicit changes in direction or form.

I am arguing for a clear metric for measuring success, so that it is actually possible to improve effectiveness in accomplishing goals. As it stands now, the goal of LessWrong is so vague that it undermines functional reasoning and decision-making, simply through its lack of clarity and specification.


Kindly, but firmly,
Caerulea-Lawrence