We don't want new users to be downvoted a lot, and most of you don't like it either. This guide was created to help new users quickly get the gist of what LessWrong subculture is about and how participation on the website works. If you came here on your own, that's excellent, because if you attempt to participate on the website without the information in this introduction, there's a pretty good chance that you'll be lost.
LessWrong is a website for a specific rationalist subculture. The main website feature is a blog. In addition to hosting user-generated posts and articles, the LessWrong blog hosts the site's main collection of writings on rationality, called "The Sequences". The term "LessWrong" is also used to describe the related IRL Meetups ("LessWrong Meetups").
The main document that has influenced LessWrong subculture is called "The Sequences", written by a variety of authors but mostly by Eliezer Yudkowsky. The main theme of The Sequences is rationality. The Sequences reflect a lot of the research on reasoning mistakes (like cognitive biases) done by people such as Daniel Kahneman, a prominent cognitive bias researcher. The Sequences differ from Kahneman's work in two main ways: Eliezer has a very engaging writing style, while Kahneman is notoriously dry, and Eliezer has taken care to warn readers about a variety of pitfalls involved in learning about cognitive biases. Other themes in The Sequences include artificial intelligence, software engineering, math, science, and atheism.
Rationality may be the main theme of the Sequences, but they contain many other themes that have influenced the subculture as well. These other themes may be why the subculture has attracted a disproportionate number of software engineers, math and science oriented individuals, people with an interest in artificial intelligence, atheists, etc. More importantly, the Sequences also contain a lot of articles with Eliezer's ideas about rationalist culture. If you have no familiarity with the cultural articles and other themes before you begin interacting, your social experiences are likely to be highly awkward. The rationalist way of thinking and subculture is extremely complex. To give you a gist of how complex it is and what kind of complexity you'll encounter:
Imagine being transported to a different country without ever having heard of that country before. You would have little hope of success in that society without first learning about the many differences between your cultures. That is how different LessWrong culture is from mainstream culture. It's not just a little bit different, like so many subcultures where one can mingle undetected by committing to a few specific cultural beliefs, wearing certain clothes, and using a few selections of subculture-specific verbiage. Instead of adopting a specific group of beliefs, rationalists have taken it quite a bit further and adopted a different way of choosing beliefs. Instead of focusing on dress, they have focused on learning. Instead of using a few dozen subculture-specific terms, there are hundreds of vocabulary words. There are a few things you should know about what this fundamentally different approach results in, so that you will have a grasp of the scope of the difference:
1. Unlike subcultures that form around politically-oriented positions, rationalists are wary of making commitments to beliefs. If one wants to be rational, one should accept an idea only because there is good evidence that the idea is likely to be true, not because one had previously chosen to be "on" a certain "side". Unlike people in religious groups, rationalists do not accept ideas on faith, even if they are presented by an authority figure. Instead, they learn to consider the specific supports for each idea and determine which ones are most likely to be correct. Unlike many people in the mainstream, rationalists are wary of conforming to beliefs merely because other rationalists promote the beliefs. There is no body of knowledge that rationalists cling to and defend as if "Guarding the Truth". Instead, rationalist subculture is about discovering and making progress.
There is no holy book, authority, set of political agendas, or set of popular beliefs that we can point you to in order to tell you which beliefs rationalists have. It would not be in the best interest of a rationalist to cling to beliefs by defining themselves with a set of specific beliefs. Instead, we can point you to various methods we might use for choosing beliefs like Bayesianism. Using Bayesian probabilities is considered by many in this subculture to be one of the most fundamental and most prominent parts of the reasoning toolbox.
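To make "using Bayesian probabilities" slightly more concrete, here is a minimal sketch of a single Bayesian update. The function and the specific numbers are invented purely for illustration; they don't come from the Sequences.

```python
# A minimal sketch of Bayesian updating; the probabilities are
# invented for illustration, not taken from any LessWrong source.
def posterior(prior, likelihood, likelihood_given_not):
    """P(H | E) via Bayes' theorem.

    prior:                P(H), belief before seeing the evidence
    likelihood:           P(E | H), chance of the evidence if H is true
    likelihood_given_not: P(E | not H), chance of the evidence otherwise
    """
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Start out 99% sure the hypothesis is false, then observe evidence
# that is ten times more likely if the hypothesis is true.
p = posterior(prior=0.01, likelihood=0.80, likelihood_given_not=0.08)
print(round(p, 3))  # prints 0.092 -- the belief shifts, but only modestly
```

The point of the sketch is the habit it illustrates: rather than declaring an idea true or false, a Bayesian reasoner shifts a probability by exactly as much as the evidence warrants.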
2. The amount of difference between this culture and mainstream culture cannot be expressed well by a short list of differences. When a group's main difference is a different way of choosing beliefs, the group ends up choosing a very large number of things that differ from the mainstream, not just a short list. Humans are affected by over a hundred cognitive biases, and rationalists aim to avoid all of them. Imagine you added over one hundred improvements to your way of thinking. In how many future situations would you make a different choice? Imagine at least that many differences when you think about what rationalists are like.
3. It takes a very long time to become good at being rational. To be a really good reasoner, you need to patch over a hundred cognitive biases. Rationalists work on their rationality because it's necessary for making good decisions. Good decisions are, of course, necessary if you want a high degree of success in life, and learning about biases is necessary just to help you avoid self-destructive decisions. Becoming more rational requires an investment. There is no quick fix. There is a lot to learn. Until you've invested a lot into learning, many of the people you'll encounter in the subculture will know a lot more about this than you do. Interacting with this subculture isn't like talking about a couple dozen bands and enjoying the same music. Daniel Kahneman's book "Judgment under Uncertainty: Heuristics and Biases" is around 600 pages long. Becoming knowledgeable about rationality is an investment. The Sequences would take in the ballpark of 80 hours to read at an average reading speed. Becoming knowledgeable about this specific subculture is an investment.
To resist the Dunning–Kruger effect (mistakenly believing you know more about a subject than you do, possibly because you simply weren't clued in to how vast the subject is), and to make the depth and breadth of this subculture seem more real to you, you could begin by browsing the titles of the articles in the Sequences. That's the closest thing there currently is to an index of the subculture. It can be found here: Sequences
On the main website feature, the community blog, there are two areas. One area is called "posts" or "discussions" while the other area is called "articles" or "main". Don't be fooled by the casual-sounding titles "posts" / "discussions". Members do not treat the posts/discussions area as a casual place for chatting or as a message board. The posts/discussions area is treated more like a community blog. The social norms are:
1. Either write something deemed useful, or go to the open thread.
Many members want to keep up with all of the posts/discussions submissions as well as all of the articles/main submissions, as if keeping up with the news. For this reason, they experience it as inefficient when there are submissions about minor details, off-topic submissions, submissions on topics that have already been covered, and meta threads (submissions about posting, about the website, about the subculture, etc.). If you want to converse about any of those things, find the most recent post labeled "Open Thread" and put them there. Somebody, often a person using the handle "Open Thread Guy", makes new open threads in posts/discussions periodically.
2. Meet the quality standard norms in both posts/discussions and articles/main.
Many members expect all submissions in posts/discussions and articles/main to be well-written and they have very high standards for this. In addition to desiring good spelling and grammar, they also like to see that you've done your homework. They like to see references, mathematical equations, graphs, vocabulary terms and want you to show familiarity with the subculture. They really do not like seeing authors make mistakes that seem to pattern match to errors like cognitive biases, logical fallacies or other errors. The standards for articles/main are higher than the standards for posts/discussions but standards for posts/discussions are still significantly higher than the standards you typically see on the Internet for message board posts.
3. Write something of quality, even when commenting.
Members have high standards for comments as well, behaving as if they want the entire page, comments included, to be full of new information that is well-presented, well-reasoned, correctly spelled, etc. The standard for posts/discussions is higher than the standard for comments, but the standard for comments is still significantly higher than the standards you typically see on the Internet for message board comments.
4. Your professional face will probably fare better than your casual face.
LessWrong members do not treat the website as a fun hangout, a joke site, or an emotional support forum. To blend in on the website, the best thing you can do is to behave more or less the way you would for a professional endeavor. Expect to do some learning before the others will accept you. Brush up on cognitive biases and logical fallacies or you will quickly be viewed as "irrational". Read about the subculture so that you can anticipate the way that people will interpret your words, how they will react to your ideas and so that you can work with these interpretations and reactions intelligently. One exception is that anonymous Internet handles are viewed as perfectly acceptable.
5. Don't expect people to be perfect rationalists, not even yourself.
Above all, remember that nobody is a perfect rationalist. You're going to make mistakes, and you're going to find reasoning errors that other members have made. You may not be able to fix other people's irrationality, but you can keep an eye out for your own mistakes. None of us were taught to think rationally in school, and we've all been steeped in beliefs that were grown, defended, selected and mutated by countless irrational decision-makers. Becoming a group of perfect rationalists would take a very long time and may not be a realistic goal. Our common goal is to refine ourselves to become less and less wrong by working together. If you get the urge to tear someone's reputation to little bits, please remember this: We've all inherited quite the ideological mess, and we're all working on this mess together. Don't expect others to be perfect rationalists. They can't be perfect but most of them do desire to be more rational.
6. Don't help us be less wrong too much.
Although it can be, for a variety of reasons, extremely tempting to go around telling people that they're wrong or starting debates, you should be aware that this behavior is likely to be interpreted as status seeking. Many members frown on social status games. Maybe you feel motivated by some form of altruism, along the lines of Randall Munroe's call to "duty" to step in because "Someone is wrong on the Internet," and you want them to be right. Maybe you really do enjoy showing off while making other people feel publicly humiliated. Regardless of whether your motives are altruistic, selfish or otherwise, please be aware that behaviors that seem similar to these are likely to be perceived as part of a social status game, an attack or trolling. LessWrong members are of course interested in learning from their mistakes, but they're also human. If you say things that could insult them, many will feel and/or behave the way that insulted humans do. Simply put: this is one of the fastest ways to make yourself unpopular. If you want to increase your status, consider this research instead: Political Skills which Increase Income