Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality

Introduction

Less Wrong is explicitly intended to help people become more rational.  Eliezer has posted that rationality means epistemic rationality (having & updating a correct model of the world) and instrumental rationality (the art of achieving your goals effectively).  Both are fundamentally tied to the real world and our performance in it - they are about ability in practice, not theoretical knowledge (except inasmuch as that knowledge helps ability in practice).  Unfortunately, I think Less Wrong is a failure at instilling abilities-in-practice, and designed in a way that detracts from people's real-world performance.

It will take some time, and it may be unpleasant to hear, but I'm going to try to explain what LW is, why that's bad, and sketch what a tool to actually help people become more rational would look like.

(This post was motivated by Anna Salamon's Humans are not automatically strategic and the responses to it; more detailed background in footnote [1].)

Update / Clarification in response to some comments: This post is based on the assumption that a) the creators of Less Wrong wish Less Wrong to result in people becoming better at achieving their goals (instrumental rationality, aka "efficient productivity"), and b) some (perhaps many) readers read it towards that goal.  It is this that I think is self-deception.  I do not dispute that LW can be used in a positive way (read during fun time instead of the NYT or funny pictures on Digg), or that it has positive effects (exposing people to important ideas they might not see elsewhere).  I merely dispute that reading fun things on the internet can help people become more instrumentally rational.  Additionally, I think instrumental rationality is really important and could be a huge benefit to people's lives (in fact, is by definition!), and so a community value that "deliberate practice towards self-improvement" is more valuable and more important than "reading entertaining ideas on the internet" would be of immense value to LW as a community - while greatly decreasing the importance of LW as a website.

Why Less Wrong is not an effective route to increasing rationality.

Definition:

Work: time spent acting in an instrumentally rational manner, ie forcing your attention towards the tasks you have consciously determined will be the most effective at achieving your consciously chosen goals, rather than allowing your mind to drift to what is shiny and fun.

By definition, Work is what (instrumental) rationalists wish to do more of.  A corollary is that Work is also what is required in order to increase one's capacity to Work.  This must be true by the definition of instrumental rationality - if it's the most efficient way to achieve one's goals, and if one's goal is to increase one's instrumental rationality, doing so is most efficiently done by being instrumentally rational about it. [2]

That was almost circular, so to add meat, you'll notice in the definition an embedded assumption that the "hard" part of Work is directing attention - forcing yourself to do what you know you ought to instead of what is fun & easy.  (And to a lesser degree, determining your goals and the most effective tasks to achieve them).  This assumption may not hold true for everyone, but with the amount of discussion of "Akrasia" on LW, the general drift of writing by smart people about productivity (Paul Graham: Addiction, Distraction, Merlin Mann: Time & Attention), and the common themes in the numerous productivity/self-help books I've read, I think it's fair to say that identifying the goals and tasks that matter and getting yourself to do them is what most humans fundamentally struggle with when it comes to instrumental rationality.

Figuring out goals is fairly personal, often subjective, and can be difficult.  I definitely think the deep philosophical elements of Less Wrong and its contributions to epistemic rationality [3] are useful to this, but (like psychedelics) the benefit comes from small occasional doses of the good stuff.  Goals should be re-examined regularly but infrequently (roughly yearly, and at major life forks).  An annual retreat with a mix of close friends and distant-but-respected acquaintances (Burning Man, perhaps) will do the trick - reading a regularly updated blog is way overkill.

And figuring out tasks, once you turn your attention to it, is pretty easy.  Once you have explicit goals, just consciously and continuously examining whether your actions have been effective at achieving those goals will get you way above the average smart human at correctly choosing the most effective tasks.  The big deal here for many (most?) of us, is the conscious direction of our attention.

What is the enemy of consciously directed attention?  It is shiny distraction.  And what is Less Wrong?  It is a blog, a succession of short fun posts with comments, most likely read when people wish to distract or entertain themselves, and tuned for producing shiny ideas which successfully distract and entertain people.  As Merlin Mann says: "Joining a Facebook group about creative productivity is like buying a chair about jogging".  Well, reading a blog to overcome akrasia IS joining a Facebook group about creative productivity.  It's the opposite of this classic piece of advice.

Now, I freely admit that this argument is relatively brief and minimally supported compared to what a really good, solid argument about exactly how to become more rational would be.  This laziness is deliberate, conscious, and a direct expression of my beliefs about the problem with LW.  I believe that most people, particularly smart ones, do way too much thinking & talking and way too little action (me included), because that is what's easy for them [4].

What I see as a better route is to gather those who will quickly agree, do things differently, (hopefully) win and (definitely) learn.  Note that this general technique has a double advantage: the small group gets to enjoy immediate results, and when the time comes to change minds, they have the powerful evidence of their experience.  It also reduces the problem that the stated goal of many participants ("get more rational") may not be their actual goal ("enjoy the company of rationalists in a way which is shiny fun, not Work"), since the call to action will tend to select for those who actually desire self-improvement.  My hope is that this post and the description below of what actual personal growth looks like inspire one or more small groups to form.

Less Wrong: Negative Value, Positive Potential

Unfortunately, in this framework, Less Wrong is probably of negative value to those who really want to become more rational.  I see it as a low-ROI activity whose shininess is tuned to attract the rationality community, and thus serves as the perfect distraction (rationality porn, rationality opium).  Many (most?) participants are allowing LW to grab their attention because it is fun and easy, and thus simultaneously distracting themselves from Work (reducing their overall Work time) while convincing themselves that this distraction is helping them to become more rational.  This reduces the chance that they will consciously Work towards rationality, since they feel they are already working towards that goal with their LW reading time. (Adding [4.5] in response to comments).

(Note that from this perspective, HP&TMoR is a positive - people know reading fanfic is entertainment, and being good enough entertainment to displace people's less educational alternative entertainments while teaching a little rationality increases the overall level of rationality.  The key is that HP&TMoR is read in "fun time", while I believe most people see LW time as "work towards self-improvement" time.  Ironic, but true for me and the friends I've polled, at least.)

That said, the property of shininess-to-rationalists has resulted in a large community of rationalists, which makes LW potentially a great resource for actual training of people's individual rationality.  And while catalyzing Work is much harder than getting positive feedback, I do find it heart-warming and promising that I have consistently received positive feedback from the LW community by pointing out its errors.  This is a community that wants to self-correct - which is unfortunately rare, and a necessary though not sufficient criterion for improvement.

This is taking too long to write [5], and we haven't even gotten to the constructive part, so I'm going to assume that if you are still with me you no longer need as detailed arguments and I can go faster.

Some Observations On What Makes Something Useful For Self-Improvement

My version: Growth activities are Work, and hence feel like work, not fun - they involve overriding your instincts, not following them.  Any growth you can get from following your instincts, you have probably had already.  And consciously directing your attention is not something that can be trained by being distracted (willpower is a muscle; you exercise it by using it).  Finding the best tasks to achieve your goals is not practiced by doing whatever tasks come to mind.  And so forth.  You may experience flow states once your attention is focused where it should be, but unless you have the incredible and rare fortune to have what is shiny match up with what is useful, the act of starting and maintaining focus and improving your ability to do so will be hard work.

The academic version: The literature on skill development ("acquisition of expertise") says that it involves "deliberate practice".  The same is very likely true of acquiring expertise in rationality.  The 6 tenets of deliberate practice are that it:

  1. Is not inherently enjoyable.
  2. Is not play or paid practice.
  3. Is relevant to the skill being developed.
  4. Is not simply watching the skill being performed.
  5. Requires effort and attention from the learner.
  6. Often involves activities selected by a coach or teacher to facilitate learning.

One must stretch quite a bit to fit these to "reading Less Wrong" - it's just too shiny and fun to be useful.  One can (and must) enjoy the results of practice, but if the practice itself doesn't take effort, you are going to plateau fast.  (I want to be clear, BTW, that I am not making a Puritan fallacy of equating effort and reward [6]).  Meditation is a great example of an instrumental rationality practice: it is a boring, difficult isolation exercise for directing and noticing the direction of one's attention.  It is Work.

What Would A Real Rationality Practice Look Like?

Eliezer has used the phrase "rationality dojo", which I think has many correct implications:

  1. It is a group of people who gather in person to train specific skills.
  2. While there are some theoreticians of the art, most people participate by learning it and doing it, not theorizing about it.
  3. Thus the main focus is on local practice groups, along with the global coordination to maximize their effectiveness (marketing, branding, integration of knowledge, common infrastructure).  As a result, it is driven by the needs of the learners.
  4. You have to sweat, but the result is you get stronger.
  5. You improve by learning from those better than you, competing with those at your level, and teaching those below you.
  6. It is run by a professional, or at least someone getting paid for their hobby.  The practitioners receive personal benefit from their practice, in particular from the value-added of the coach - enough to pay for talented coaches.

In general, a real rationality practice should feel a lot more like going to the gym, and a lot less like hanging out with friends at a bar.

To explain the ones that I worry will be non-obvious:

1) I don't know why an in-person group is important, but it seems to be - all the people who have replied to me so far saying they get useful rational practice out of the LW community said the growth came through attending local meetups (example).  We can easily invent some evolutionary psychology story for this, but it doesn't matter why; at this point it's enough to just know.

6) There are people who can do high-quality productive work in their spare time, but in my experience they are very rare.  It is very pleasant to think that "amateurs can change the world" because then we can fantasize about ourselves doing it in our spare time, and it even happens occasionally, which feeds that fantasy, but I don't find it very credible.  I know we are really smart and there are memes in our community that rationalists are way better than everyone else at everything, but frankly I find the idea that people writing blog posts in their spare time will create a better system than trained professionals for improving one's ability to achieve one's goals to be ludicrous.  I know some personal growth professionals, and they are smart too, and they have had years of practice and study to develop practical experience.  Talk is cheap, as is time spent reading blogs: if people actually value becoming more rational, they will pay for it, and if there are good teachers, they will be worth being paid.  Money is a unit of learning [7].

There are some other important aspects which such a practice would have that LW does not:

  1. The accumulation of knowledge.  Blogs are inherently rewarding: people read what is recent, so you get quick feedback on posts and comments.  However, they are inherently ephemeral for the same reason - people read what is recent, and posts are never substantially revised.  The voting system helps a little, but it can't come close to fixing the underlying structure.  To be efficient, much less work should go into ephemeral posts, and much more into accumulating and revising a large, detailed, nuanced body of knowledge (this is exactly the sort of "work not fun" activity that you can get by paying someone, but are unlikely to get when contributors are volunteers).  In theory, this could happen on the Wiki, but in practice I have rarely seen Wikis succeed at this (with the obvious exception of Le Wik).
  2. It would involve more literature review and pointers to existing work.  The obvious highest-ROI way to start working on improving instrumental rationality is to research and summarize the best existing work for self-improvement in the directions that LW values, not to reinvent the wheel.  Yes, over time LW should produce original work and perhaps eventually the best such work, but the existing work is not so bad that it should just be ignored.  Far from it!  In reference to (1), perhaps this should be done by creating a database of reviews and ratings by LWers of the top-rated self-improvement books, perhaps eventually with ratings taking into account the variety of skills people are seeking and ways in which they optimally learn.
  3. It would be practical - most units of information (posts, pages, whatever) would be about exercises or ideas that one could immediately apply in one's own life.  It would look less like most LW posts (abstract, theoretical, focused on chains of logic), and more like Structured Procrastination, the Pmarca Guide To Personal Productivity, books like Eat That Frog!, Getting Things Done, and Switch [8].  Most discussion would be about topics like those in Anna's post - how to act effectively, what things people have tried, what worked, what didn't, and why.  More learning through empiricism, less through logic and analysis.

In forming such a practice, we could learn from other communities which have developed a new body of knowledge about a set of skills and disseminated it with rapid scaling within the last 15 years.  Two I know about and have tangentially participated in are:

  1. PUA (how to pick up women).  In fact, a social skills community based on PUA was suggested on LW a few days ago (glad to see that others are interested in practice and not just talk!)
  2. CrossFit (synthesis of the best techniques for time-efficient broad-applicability fitness)

Note that both involve most of my suggested features (PUA has some "reading not doing" issues, but it's far ahead of LW in having an explicit cultural value to the contrary - for example, almost every workshop features time spent "in the field").  One feature of PUA in particular I'd like to point out is the concept of the "PUA lair" - a group of people living together with the explicit intention of increasing their PUA skills.  As the lair link says: "It is highly touted that the most proficient and fastest way to improve your skills is to hang out with others who are ahead of you, and those whose goals for improvement mirror your own." [9]

Conclusion

If LW is to accomplish its goal of increasing participants' instrumental rationality, it must dramatically change form.  One of the biggest, perhaps the biggest element of instrumental rationality is the ability to direct one's attention, and a rationality blog makes people worse at this by distracting their attention in a way accepted by their community and that they will feel is useful.  From The War Of Art [10]:

Often couples or close friends, even entire families, will enter into tacit compacts whereby each individual pledges (unconsciously) to remain mired in the same slough in which she and all her cronies have become so comfortable.  The highest treason a crab can commit is to make a leap for the rim of the bucket.

To aid growth at rationality, Less Wrong would have to become a skill practice community, more like martial arts, PUA, and physical fitness, with an explicit focus of helping people grow in their ability to set and achieve goals, combining local chapters with global coordination, infrastructure, and knowledge accumulation.  Most discussion should be among people working on a specific skill at a similar level about what is or isn't working for them as they attempt to progress, rather than obscure theories about the inner workings of the human mind.

Such a practice and community would look very different, but I believe it would have a far better chance to actually make people more rational [11].  There would be danger of cultism and the religious fervor/"one true way" that self-help movements sometimes have (Landmark), and I wonder if it's a profound distaste for anything remotely smelling of cult that has led Eliezer & SIAI away from this path.  But the opposite of cult is not growth, it is to continue being an opiate for rationalists, a pleasant way of making the time pass that feels like work towards growth and thus feeds people's desire for guiltless distraction.

To be growth, we must do work, people must get paid, we must gather in person, focus on action not words, put forth great effort over time to increase our capacity, use peak experiences to knock people loose from ingrained patterns, and copy these and much more from the skill practice communities of the world.  Developed by non-rationalists, sure, but the ones that last are the ones that work [12] - let's learn from their embedded knowledge.

Addendum

That was 5 hours of my semi-Work time, so I really hope it wasn't wasted, and that some of you not only listen but take action.  I don't have much free time for new projects, but if people want to start a local rationality dojo in Mountain View/Sunnyvale, I'm in.  And there is already talk, among some reviewers of this draft, of putting together an introductory workshop.  Time will tell - and the next step is up to you.

Footnotes

[1] Anna Salamon posted Humans are not automatically strategic, a reply to the very practical A "Failure to Evaluate Return-on-Time" Fallacy.  Anna's post laid out a nice rough map of what an instrumentally rational process for goal achievement would look like (consciously choosing goals, metrics, researching solutions, experimenting with implementing them, balancing exploration & exploitation - the basic recipe for success at anything), said she was keen to train this, and asked:

So, to second Lionhearted's questions: does this analysis seem right?  Have some of you trained yourselves to be substantially more strategic, or goal-achieving, than you started out?  How did you do it?  Do you agree with (a)-(h) above?  Do you have some good heuristics to add?  Do you have some good ideas for how to train yourself in such heuristics?

After reading the comments, I made a comment which began:

I'm disappointed at how few of these comments, particularly the highly-voted ones, are about proposed solutions, or at least proposed areas for research. My general concern about the LW community is that it seems much more interested in the fun of debating and analyzing biases, rather than the boring repetitive trial-and-error of correcting them.

Anna's post was upvoted into the top 10 all-time on LW in a couple days, and my comment quickly became the top on the post by a large margin, so both her agenda and my concern seem to be widely shared.  While I rarely take the time to write LW posts (as you would expect from someone who believes LW is not very useful), this feedback gave me hope that there might be enough untapped desire for something more effective that a post might help catalyze enough change to be worthwhile.

[2] There are many other arguments as to why improving one's ability to do work is unlikely to be fun and easy, of course.  With a large space of possible activities, and only a loose connection between "fun" and "helps you grow" (via evolutionary biology), it seems a priori unlikely that fun activities will overlap with growthful ones.  And we know that a general recipe for getting better at X is to do X, so if one wants to get better at directing one's attention to the most important tasks and goals, it seems very likely that one must practice directing one's attention.  Furthermore, there is evidence that, specifically, willpower is a muscle.  So the case for growing one's instrumental rationality through being distracted by an entertaining rationality blog is...awfully weak.

[3] What are the most important problems in the world?  Who is working most effectively to fix them and how can you help?  Understanding existential risks is certainly not easy, and important to setting that portion of your goals that has to do with helping the world - which is a minor part of most people's goals, which are about their own lives and self-interest.

[4] I also believe the least effective form of debate is trying to get people to change their minds.  Therefore, an extensive study and documentation to create a really good, solid argument trying to change the minds of LWers who don't quickly agree with my argument sketch would be a very low-return activity compared to getting together those who already agree and doing an experiment.  And instrumental rationality is about maximizing the return on your activities, given your goals, so I try to avoid low-return activities.

[4.5] A number of commenters state that they consciously read LW during fun time, or read it to learn about biases and existential risk, not to become more rational, in which case it is likely of positive value.  If you have successfully walled off your work from shiny distractions, then you are advanced in the ways of attention and may be able to use this particular drug without negative effects, and I congratulate you.  If you are reading it to learn about topics of interest to rationalists and believe that you will stop there and not let it affect your productivity, just be warned that many an opiate addiction has begun with a legitimate use of painkillers.

Or to go back to Merlin's metaphor: If you buy a couch to sit on and watch TV, there's nothing wrong with that.  You might even see a sports program on TV that motivates you to go jogging.  Just don't buy the couch in order to further your goal of physical fitness.  Or claim that couch-buyers are a community of people committed to becoming more fit, because they sometimes watch sports shows and sometimes get outside.  Couch-buyers are a community of people who sit around - even if they watch sports programs.  Real runners buy jogging shoes, sweat headbands, GPS route trackers, pedometers, stopwatches...

[5] 1.5 hrs so far.  Time tracking is an important part of attention management - if you don't know how your time is spent, it's probably being spent badly.

[6] Specifically, I am not saying that growth is never fun, or that growth is proportional to effort, only that there are a very limited number of fun ways to grow (taking psychedelics at Burning Man with people you like and respect) and you've probably done them all already.  If you haven't, sure, of course you should do them, and yes, of course discovering & cataloging such things is useful, but there really aren't very many so if you want to continue to grow you need to stop fooling yourself that reading a blog will do it and get ready to make some effort.

[7] Referencing Eliezer's great Money: The Unit of Caring, of course.  I find it ironic that he understands basic economics intellectually so well as to make one of the most eloquent arguments for donating money instead of time that I've ever seen, yet seems to be trying to create a rationality improvement movement without, as far as I can tell, involving any specialists in the art of human change or growth.  That is, without using the method that grownups use - what you do when you want something to actually get done.  You use money to employ full-time specialists.

[8] I haven't actually read this one yet, but their other book, Made To Stick, was an outstanding study of memetic engineering so I think it very likely that their book on habit formation is good too.

[9] Indeed.  I happen to have a background of living in and founding intentional communities (Tortuga!), and in fact currently rent rooms to LWers Divia and Nick Tarleton, so I can attest to the value of one's social environment and personal growth goals being synchronized.  Benton House is likely an example as well.  Groups of rationalists living together will automatically practice, and have that practice reinforced by their primate desire for status within the group; this is almost surely the fastest way to progress, although not required or suited to everyone.

[10] The next paragraph explains why I do my best not to spend much time here:

The awakening artist must be ruthless, not only with herself but with others.  Once you make your break, you can’t turn around for your buddy who catches his trouser leg on the barbed wire.  The best thing you can do for that friend (and he’d tell you this himself, if he really is your friend) is to get over the wall and keep motating.

Although I suppose I am violating the advice by turning around and giving a long speech about why everyone else should make a break too :).  My theory is that by saying it right once, I can refrain from wasting any more time saying it again in the future, should this attempt not work.  But that may just be rationalizing.  On the other hand, doing things "well or not at all" is rational in situations where the return curve is steep.  Given my low evaluation of LW's usefulness, I obviously think the early part of the return curve is basically flat zero.  We will see if it is hubris to think the right post can really make a difference, and that I can make that post.  Certainly plenty of opportunity for bias in both those statements.

[11] Note that helping people become personally more effective is a much easier meme to spread than helping people better understand how to contribute to public goods (ie how to better understand efficient charity and existential risk).  They have every incentive to do the former and little incentive to do the latter.  So training people in general goal achievement (instrumental rationality) is likely to have far broader appeal and reach far more people than training them in the aspects of epistemic rationality that SIAI is most interested in.  This large community who have grown through the individually beneficial part of the philosophy is then a great target market for the societally beneficial part of the philosophy.  (A classic one-two punch used by spiritual groups, of course: provide value, then teach values.  It works.  If rationalists do what works...)  I've been meaning to make a post on the importance of personal benefit to spreading memes for a while; this paragraph will have to do for now...

[12] And the ones with good memetic engineering, including use of the Dark Arts.  Many difficult decisions will need to be made about what techniques are and aren't Dark Arts and which are worth using anyway.  The fact remains that just like a sports MVP is almost certainly both more skilled and more lucky than his peers, a successful self-help movement is almost certainly both more effective at helping people and better memetically engineered than its peers.  So copy - but filter.

251 comments

To get more meta, not only has Less Wrong not produced "results", but all the posts saying Less Wrong needs to produce more "results" (example: Instrumental Rationality Is A Chimera) haven't produced any results. Even though most people liked the idea in that recent PUA thread, I don't see any concrete moves in that direction either.

Most of these threads have been phrased along the lines of "Someone really ought to do something about this", and then everyone agrees that yeah, they should, and then nothing ever comes out of it. That's a natural phenomenon in an anarchy where no one is the Official Doer of Difficult Things That Need To Be Done. Our community has one leader, Eliezer, and he has much better things to do with his time. Absent a formal organization, no one is going to be able to move a few hundred people to do things differently.

But small interventions can have major changes on behavior (see the sentence beginning with "I was reminded of this recently..." here). For example, I think if there were socialskills.lesswrong.com and health.lesswrong.com subcommunities linked to the top of the page, they would auto-populate with a community and interesting posts. I would love to see a discussion forum on nootropics where people can post their experiences and questions in an organized and easy to find way, for example. This idea has been brought up since forever and no one has ever done anything about it. The alternate idea, that we make a bulletin board in which these things can be done easily and naturally (AND WHICH CAN HANDLE OPEN THREADS IN A SANE WAY) has also been brought up since forever and no one has done anything about it (one person made a bulletin board back in the Overcoming Bias days, but no one used it. Go figure.)

So I propose the following:

  1. Community norm against saying "It would be nice if someone in our community did X" if you have no particular plans to do X and no reason to think anyone else will.

  2. Poll on whether people want a bulletin board or subreddits. This poll is below this comment.

  3. If people want a bulletin board, and they promise to actually use it once it is made, and Eliezer and Tricycle don't want to make it themselves, and no one else more competent with computers will make it, I will make and host it (maybe. I'm not sure how much traffic it would get and I don't want to commit to something that would bankrupt me. But in principle, yes.)

  4. I don't know how to program subreddits, but if that solution wins the poll, I will pay someone who does know a small amount of money to do it, and other people probably will too (because we will do the fundraising in a rationalist way!) adding up to a medium amount of money.

Upvote this if, out of the solution set [keep things the way they are, have subreddits, have bulletin board], you would prefer to have subreddits.

I would prefer subreddits, and would match a consensus donation of up to $10 on pledgebank.

This won the poll, so I'm going to talk to some people and see how it would get done and what it would take. I'll report back when I get answers, maybe in an Open Thread or somewhere.

AND WHICH CAN HANDLE ANY THREADS IN A SANE WAY

Fixed.

My only feasible way to follow rapidly developing discussions is still to read the recent comments rather than the thread itself (since long threads bleed onto continuation pages)... basically a full table scan of LessWrong. The flat view of comments is better for avoiding missing something, even with comments on other posts thrown in.

It would be nice to have a recent comments link for specific threads.

Upvoted for agreement. Even better, add subscription flags to threads, and provide a recent comments view that shows only the subscribed threads.

Poking around at the source tree, this seems to be the current CMS template for the global recent comments page. As far as I can tell, the query for listing the comments is here.

A quick hack solution would be to add a second comment query where comments from posts one isn't interested in are filtered out of the all comments query before the list is returned.
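For illustration, that quick hack might look something like the following: a minimal Python sketch of the filtering step, assuming the all-comments query has already returned a most-recent-first list. The function name, dictionary fields, and data shapes here are hypothetical, not the actual LessWrong/Reddit codebase API.

```python
def filter_recent_comments(comments, subscribed_post_ids):
    """Return only the comments belonging to subscribed posts,
    preserving the original (most-recent-first) order."""
    subscribed = set(subscribed_post_ids)  # set gives O(1) membership tests
    return [c for c in comments if c["post_id"] in subscribed]

# Example: three recent comments spread across two posts.
recent = [
    {"id": 301, "post_id": 17, "body": "latest reply"},
    {"id": 300, "post_id": 42, "body": "comment on another thread"},
    {"id": 299, "post_id": 17, "body": "earlier reply"},
]

# A user subscribed only to post 17 sees just its two comments, still in order.
print(filter_recent_comments(recent, [17]))
```

The point of filtering the already-fetched list, rather than changing the SQL query, is that it touches only the template layer; the obvious cost is that a page of "recent comments" may come back mostly empty for users subscribed to few threads.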

I've been suggesting developing trn capabilities for a while, but that would be a big job. Adding the most valuable aspects as needed probably makes more sense.

Upvote this if, out of the solution set [keep things the way they are, have subreddits, have bulletin board], you like the way things are now.

Even though most people liked the idea in that recent PUA thread, I don't see any concrete moves in that direction either.

Seriously? That's a pretty quick judgement! I wrote most of a follow-up post, but I'm going to reevaluate it a bit in light of Patri's article.

I strongly support proposal 1, and I'd welcome some monitoring to make sure I don't violate this new norm.

If the subreddits idea wins, I will also chip in for the technical cost. Social.lesswrong.com seems like a decent way to do the thing-that-isn't-PUA.

You're right, I was rounding you to the nearest cliche of the last few people who said this sort of thing, and I was wrong.

Forgive me if I'm just being oblivious, but did anything end up happening on this?

I messaged Eliezer several times about this and he never got back to me. I talked to Tricycle, they said they were working on something, and what ended up happening was the split between Discussion and Main. This was not quite what I wanted, but given my inability to successfully contact Eliezer at the time I gave up.

Personally, I would say there has been very clear progress between 2010 and now, though I suppose if you don't think much of CFAR you might suppose otherwise.

Progress, yes, but I'm not seeing anything quite on the level of the call to action presented here. The argument isn't that LessWrong isn't useful, but that it is forgoing the recursive returns on its investments, returns that would benefit it far more than the current (slowly advancing) practices do.

I certainly don't think we're "there yet," but it seems somewhat uncharitable to say that nothing ended up happening. I also don't think the final stage of rationality practice/training will look like a martial arts dojo in almost any respect.

I'm sorry, but creating subreddits is too trivial a task, and one that would bootstrap this specific advancement, to overlook. The only way to excuse this oversight is if the administrators were trying to perform some kind of "test" to see if the community could work around the problem, but that's really stretching it. I fault the entire system regardless. I suppose I don't disagree that it is somewhat uncharitable, but the advancements that have been made aren't ...

Looking over your submission history, I can see what's happening here. You are advancing and improving, and writing posts about it, with those posts being received well, but the reception is far from effective. There are any number of psychological tendencies in place to cause you to inaccurately project your own advancements onto your peers. The truth is Eliezer_Yudkowsky has already embedded a ton of these lessons in the sequences over and over again. You're stating them more formally and circling the deeper ubiquitous causes of specific individual opinions here and there, but you've yet to make the post that resonates with the community and starts breaking some of the heavier cognitive barriers in place whose side-effects you've been formalizing.

It's all well and good, you're doing well, and your effort is paying off, and the community is advancing. Some of us are just getting really impatient with how slowly LessWrong refines itself in the immediate presence of so much rationality optimizing knowledge.

I honestly expected my comments back here three years in the past to go unnoticed for some time. That people still pay attention to these events is surprising. That you took the time to reply was surprising, and while I recognized your name as the author of one of the recent LessWrong-advancing posts, I didn't properly think of the full implications until now. As long as you're paying attention across time, I might as well point out to you that nobody else is. I was going to focus on getting this article bumped tomorrow, but if you are already here now, I might as well simply suggest you start thinking about an article about visiting the past posts of LessWrong.

I was going to focus on getting this article bumped tomorrow, but if you are already here now, I might as well simply suggest you start thinking about an article about visiting the past posts of LessWrong.

I'd suggest that you go along with this anyway-- while I have an article in the works that deals with some of these matters, it won't be forthcoming for some time.


My own attempt at an article would be something vastly different, encompassing issues in such a way that article revival (anti-forgetfulness) would be a more apparent issue in need of being addressed. That's just one aspect in a deeper pool of cognitive shortcomings that I aim to empty significantly. But first I need to acquire a more detailed picture of exactly what set of biases exist in that pool, so as to trip only the ones that produce a productive pattern of thought when activated. More or less, I need to (l)earn the karma.

Article/thought re-ignition is simply an immediate and (presumably) "easily" communicable step that would produce powerful results; this community is sitting on a gold mine of cognition just waiting to be used.

  1. Didn't Sarah C just have a big post about this as a fallacy?

Most of these threads have been phrased along the lines of "Someone really ought to do something about this", and then everyone agrees that yeah, they should, and then nothing ever comes out of it. That's a natural phenomenon in...

I think it's a natural phenomenon on a blog - a format which is so anti-growth, so focused on shininess, that even energy towards productive change, when directed through the blog, goes nowhere. One big reason is the community norm of this all being free stuff done in spare time (except for Eliezer). Helping people grow, and designing curricula for and monitoring their growth, is hard work. It requires professional time and getting paid.

I do X all the time in my life and in my organization. The question is whether someone will take the time to create X for others. I am happy to participate in figuring out how to do X by supplying some of my very limited time. I will pay for X (workshops, coaching, or instruction), if X is taught more effectively by this community than by the many other places offering to help me grow and become better at achieving my goals. That demand will create its own supply.

Re: subreddits & bulletin boards - Great, more shiny ways to waste people's time. Real change happens from what you do off the internet; is that so hard an idea to understand?

  1. I have re-read the Affect Heuristic post, and I don't see its relevance. Explain?

One of the last posts on this sort of thing mentioned the phrase "'Good enough' is the enemy of 'at all'".

Yes, the best way to do this would have in-person groups with paid instructors. I interpreted you as saying we should go create these groups. If your point was that these groups already exist and we should get off Less Wrong and go to them, then I misunderstood, but I am still doubtful. The vast majority of people don't have access to them (live in smaller cities without such groups, don't have time for such groups, et cetera), those who do probably don't know it,