Introduction to LessWrong Subculture

This page was written in 2015 and imported from the old LessWrong wiki in 2020. If this page is your first exposure to LessWrong, we recommend starting with Welcome to LessWrong!, which serves as the up-to-date About and Welcome page.


MichaelA (3y):

I think it's good that a page like this exists; I'd want to be able to use it as a go-to link when suggesting people engage with or post on LessWrong, e.g. in my post Notes on EA-related research, writing, testing fit, learning, and the Forum.

Unfortunately, it seems to me that this page isn't well suited to that purpose. Here are some things that seem like key issues to me (maybe other people would disagree):

  • This introduction seems unnecessarily intimidating, unwelcoming, and (in my perception) somewhat arrogant. For example:
    • "If you have no familiarity with the cultural articles and other themes before you begin interacting, your social experiences are likely to be highly awkward. The rationalist way of thinking and subculture is extremely, extremely complex. To give you a gist of how complex it is and what kind of complexity you'll encounter:"
      • This feels to me like saying "We're very special and you need to do your homework to deeply understand us before interacting at all with us, or you're just wasting our time and we'll want you to go away."
      • I do agree that the rationalist culture can take some getting used to, but I don't think it's far more complex or unusual than the cultures in a wide range of other subcultures, and I think it's very often easiest to get up to speed with a culture partly just by interacting with it.
      • I do agree that reading parts of the Sequences is useful, and that it's probably good to gently encourage new users to do that. But I wouldn't want to make it sound like it's a hard requirement or like they have to read the whole thing. And this passage will probably cause some readers to infer that, even if it doesn't outright say it. (A lot of people lurk more than they should, have imposter syndrome, etc.)
        • I started interacting on LessWrong before having finished the Sequences (though I'd read some), and I think I both got and provided value from those interactions.
      • Part of this is just my visceral reaction to any group saying their way of thinking and subculture is "extremely, extremely complex", rather than me having explicit reasons to think that that's bad.
  • I wrote all of that before reading the next paragraphs, and the next paragraphs very much intensified my emotional feeling of "These folks seem really arrogant and obnoxious and I don't want to ever hang out with them"
    • This is despite the fact that I've actually engaged a lot on LessWrong, really value a lot about it, rank the Sequences and HPMOR as among my favourite books, etc.
  • Maybe part of this is that the page describes what rationalists aim to be as if all rationalists always hit that mark.
    • Rationalists and the rationalist community often do suffer from the same issues other people and communities do. This was in fact one of the really valuable things Eliezer's posts pointed out (e.g., being wary of trending towards cult-hood).

Again, these are just my perceptions. But FWIW, I do feel these things quite strongly. 

Here are a couple of much less important issues:

  • I don't think I'd characterise the Sequences as "mostly like Kahneman, but more engaging, and I guess with a bit of AI etc." From memory, a quite substantial chunk of the Sequences - and of their value - had to do with things other than cognitive biases, e.g. what goals one should form, why, and how to act on them. Maybe this is partly a matter of instrumental rather than just epistemic rationality.
    • Relatedly, I think this page presents a misleading or overly narrow picture of what's distinctive (and good!) about rationalist approaches to forming beliefs and making decisions when it says "There are over a hundred cognitive biases that humans are affected by that rationalists aim to avoid. Imagine you added over one hundred improvements to your way of thinking."
  • "Kahneman is notoriously dry" feels like an odd thing to say. Maybe he is, but I've never actually heard anyone say this, and I've read one of his books and papers and watched one of his talks and found them all probably somewhat more engaging than similar things from the average scientist. (Though maybe this was more the ideas themselves, rather than the presentation.)

(I didn't read "Website Participation Intro" or "Why am I being downvoted?", because it was unfortunately already clear that I wouldn't want to link to this page when aiming to introduce people to LessWrong and encourage them to read, comment, and/or post there.)

MichaelA (3 points, 3y):
(Update: I just saw the post Welcome to LessWrong!, and I think it serves my needs well.)
Ruby (3 points, 3y):
Hey, sorry that you came across this instead of the current welcome/about page. I agree with much of your feedback here; glad the Welcome/About page does meet the need. I added a note to this page saying it was written in 2015 (by one particular user, as you'll see in the history). So we've got it for historical reasons, but I also wouldn't use it as an intro.
Created by wedrifid, 4y ago


The Purpose of the Introduction to LessWrong

We don't want new users to be downvoted a lot, and most of you don't like it either. This guide was created to help new users quickly get the gist of what the LessWrong subculture is about and how website participation works, so that they can gain some orientation. If you came here on your own, that's excellent: if you attempt to participate on the website without the information in this introduction, there's a good chance you'll be lost.

What is LessWrong?

LessWrong refers to a website for a specific rationalist subculture. The main website feature is a blog. In addition to hosting user-generated posts and articles, the LessWrong blog also hosts LessWrong's main collection of writings. This collection, called "The Sequences", is about rationality and was written by a variety of authors. The term "LessWrong" is also used to describe the related in-person meetups ("LessWrong Meetups").

What are The Sequences?

The main document that has influenced LessWrong subculture is called "The Sequences", written by a variety of authors but mostly by Eliezer Yudkowsky. The main theme of The Sequences is rationality. The Sequences reflect a lot of the research done on reasoning mistakes (like cognitive biases) by people such as Daniel Kahneman (a prominent cognitive bias researcher). The two main differences between The Sequences and Daniel Kahneman's work are that Eliezer has a very engaging writing style, while Kahneman is notoriously dry, and Eliezer has taken care to warn readers about a variety of pitfalls involved in learning about cognitive biases (http://lesswrong.com/lw/he/knowing_about_biases_can_hurt_people/). Other themes in The Sequences include artificial intelligence, software engineering, math, science, and atheism.

A Gist of the LessWrong Rationalist Subculture

The main document that has influenced LessWrong subculture is The Sequences. The main theme of The Sequences may be rationality, but there are many other themes in The Sequences which have influenced the subculture as well. These other themes may be why the subculture has attracted a disproportionate number of software engineers, math and science oriented individuals, people with an interest in artificial intelligence, atheists, etc. More importantly, The Sequences also contain a lot of articles with Eliezer's ideas about rationalist culture. If you have no familiarity with the cultural articles and other themes before you begin interacting, your social experiences are likely to be highly awkward. The rationalist way of thinking and subculture is extremely, extremely complex. To give you a gist of how complex it is and what kind of complexity you'll encounter:

Imagine being transported to a different country without ever having heard of that country before. You would have little hope of success in that society without first learning about the many differences between your cultures. That...

1. Unlike subcultures that form around politically-oriented positions, rationalists are wary of making commitments to beliefs. If one wants to be rational, one should accept an idea only because there is good evidence that the idea is likely to be true, not because one had previously chosen to be "on" a certain "side". Unlike people in religious groups, rationalists do not accept ideas on faith, even if they are presented by an authority figure. Instead, they learn to consider the specific supports for each idea and determine which ones are most likely to be correct. Unlike many people in the mainstream, rationalists are wary of conforming to beliefs merely because other rationalists promote them. There is no body of knowledge that rationalists cling to and defend as if "Guarding the Truth". Instead, rationalist subculture is about discovering and making progress.

2. There is no holy book, authority, set of political agendas, or set of popular beliefs that we can point you to in order to tell you which beliefs rationalists have. It would not be in the best interest of a rationalist to cling to beliefs by defining themselves with a set of specific beliefs. Instead, we can point you to various methods we might use for choosing beliefs, like Bayesianism. Using Bayesian probabilities is considered by many in this subculture to be one of the most fundamental and most prominent parts of the reasoning toolbox (see the sketch after this list).

3. It takes a very long time to become good at being rational. To be a really good reasoner, you need to patch over a hundred cognitive biases. Rationalists improve their rationality because it's necessary if you want to make good decisions. Good decisions are, of course, necessary if you want a high degree of success in life, and learning about biases is necessary just to help you avoid self-destructive decisions. Becoming more rational requires an investment. There is no quick fix. There is a lot to learn. Until you've invested a lot into learning, many of the people you'll encounter in the subculture will know a lot more about this than you do. Interacting with this subculture isn't like talking about a couple dozen bands and enjoying the same music. "Judgment under Uncertainty: Heuristics and Biases", which Kahneman co-edited, is around 600 pages long. Becoming knowledgeable about rationality is an investment. The Sequences would take in the ballpark of 80 hours to read at an average reading speed. Becoming knowledgeable about this specific subculture is an investment.

To resist the Dunning–Kruger effect (mistakenly believing you know more about a subject than you do, possibly because you simply weren't clued in to how vast the subject is), and to make the depth and breadth of this subculture seem more real to you, you could begin by browsing the titles of articles in the Sequences. That's the closest thing there currently is to an index of the subculture. That can be found here: Sequences.

5. Don't expect people to be perfect rationalists, not even yourself.

Above all, remember that nobody is a perfect rationalist. You're going to make mistakes, and you're going to find reasoning errors that other members have made. You may not be able to fix other people's irrationality, but you can keep an eye out for your own mistakes. None of us were taught to think rationally in school, and we've all been steeped in beliefs that were grown, defended, selected and mutated by countless irrational decision-makers. Becoming a group of perfect rationalists would take a very long time and may not be a realistic goal. Our common goal is to refine ourselves to become less and less wrong by working together. If you get the urge to tear someone's reputation to little bits, please remember this: we've all inherited quite the ideological mess, and we're all working on this mess together. Don't expect others to be perfect rationalists. They can't be perfect, but most of them do desire to be more rational.

6. Don't help us be less wrong too much.

Although it can be, for a variety of reasons, extremely tempting to go around telling people that they're wrong or starting debates, you should be aware that this behavior is likely to be interpreted as status seeking. Many members frown on social status games. Maybe you feel motivated by some form of altruism along the lines of Randall Munroe's call to "duty" to step in because "Someone is wrong on the Internet" and you want them to be right. Maybe you really do enjoy showing off while making other people feel publicly humiliated. Regardless of whether your motives are altruistic, selfish or otherwise, please be aware that behaviors that seem similar to these are likely to be perceived as part of a social status game, an attack or trolling. LessWrong members are of course interested in learning from their mistakes, but they're also human. If you say things that could insult them, many will feel and/or behave the way that insulted humans do. Simply put: this is one of the fastest ways to make yourself unpopular. If you want to increase your status, consider this research instead: Political Skills which Increase Income (http://lesswrong.com/lw/jsp/political_skills_which_increase_income/).
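Point 2 above mentions Bayesian probabilities as a core reasoning tool without showing what an update actually looks like. Here is a minimal sketch in Python; it is not taken from the Sequences, and the function name and the numbers are invented purely for illustration:

# Minimal sketch of a single Bayesian update (illustrative numbers only).
def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    # Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
    # where P(E) sums over the hypothesis being true or false.
    p_evidence = (p_evidence_if_true * prior
                  + p_evidence_if_false * (1.0 - prior))
    return p_evidence_if_true * prior / p_evidence

# Start out 20% confident in a hypothesis, then observe evidence that is
# 75% likely if the hypothesis is true but only 10% likely if it is false.
print(posterior(0.20, 0.75, 0.10))  # ~0.65: the evidence raises confidence

The point of the exercise is that the size of the update is driven by the likelihood ratio (75% vs. 10% here), not by how confident anyone sounds.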