Context for Draft / Request for Feedback
The LessWrong team is hoping to soon display a new About/Welcome page which does an improved job of conveying what LessWrong.com is about and how community members can productively use the site.
However, LessWrong is a community site, and I (plus the team) feel it's not appropriate for us to unilaterally declare what LessWrong is about. So here's our in-progress draft of a new About/Welcome page. Please let us know what you think in the comments. Please especially let us know if you think LessWrong is actually about something else. Or even just what it means to you.
LessWrong is a community blog devoted to the art of human rationality.
We invite you to use this site for any number of reasons, including, but not limited to: learning valuable things, being entertained, sharing and getting feedback on your ideas, and participating in a community you like. However, fundamentally, this site is designed for two main uses:
- As a place to level-up your rationality
- As a place to apply your rationality to important real-world problems
Primary things to do on LessWrong are:
- Read LessWrong’s repository of rationality materials
- Join a local rationality meetup
- Join in a discussion
- Ask or answer a question
- Write a post
Leveling up your rationality
First off, what is rationality?
Rationality is a term which can have different connotations to different people. On LessWrong, we mean something like the following:
- Rationality is thinking in ways which systematically arrive at truth.
- Rationality is thinking in ways which cause you to achieve your goals.
- Rationality is trying to do better on purpose.
- Rationality is reasoning well even in the face of massive uncertainty.
- Rationality is making good decisions even when it’s hard.
- Rationality is being self-aware, understanding how your own mind works, and applying this knowledge to thinking better.
What rationality is not:
- Forsaking all human emotion and intuition to embrace Cold Hard Logic.
Why should I care about rationality?
One reason to care about rationality is that you intrinsically care about having true beliefs. You might also care about rationality because you care about anything at all. Our ability to achieve our goals depends on 1) our ability to understand and predict the world, 2) having the skills to make good plans, and 3) having the self-knowledge and self-mastery to avoid falling into common pitfalls of human thinking. These core topics in rationality are of interest to anyone with non-trivial goals, from curing persistent insomnia and building fulfilling relationships to performing groundbreaking research or curing the world’s greatest ills.
See also Why truth? And...
How does LessWrong help me level up my rationality?
A repository of rationality knowledge
LessWrong has an extensive Library containing hundreds of essays on rationality topics. You can get started on the Library page or from the homepage. Among the newer material, we particularly recommend Curated posts.
The writings of Eliezer Yudkowsky and Scott Alexander comprise the core readings of LessWrong. As part of the founding of LessWrong, Eliezer Yudkowsky wrote a long series of blog posts, originally known as The Sequences and more recently compiled into an edited volume, Rationality: From AI to Zombies.
Rationality: From AI to Zombies is a deep exploration of how human minds can come to understand the world they exist in - and all the reasons they so often fail to do so. The comprehensive work:
- lays foundational conceptions of belief, evidence, and understanding
- reviews the systematic biases and common excuses which cause us to believe false things
- offers guidance on how to change our minds and how to use language effectively to describe the world
- depicts the nature of human psychology with reference to how evolution produced us
- clarifies the kind of morality humans like us can have in a reducible, physical world
- and repeatedly reminds us that confusion and mystery exist only in our minds.
Eliezer covers these topics and many more through allegory, anecdote, and scientific theory. He tests these ideas by applying them to debates in artificial intelligence (AI), physics, metaethics, and consciousness.
Eliezer also wrote Harry Potter and the Methods of Rationality (HPMOR), an alternative universe version of Harry Potter in which Harry’s adoptive parents raised him with Enlightenment ideals and the experimental spirit. This work introduces many of the ideas from Rationality: A-Z in a gripping narrative.
Scott Alexander’s essays on how good reasoning works, how to learn from the institution of science, and the different ways society has been and could be organized have been made into a collection called The Codex. The Codex contains such exemplary essays as:
- Beware Isolated Demands for Rigor
- The noncentral fallacy - the worst argument in the world?
- The Categories Were Made For Man, Not Man For The Categories
- I Can Tolerate Anything Except the Outgroup
Members of LessWrong draw on many of the ideas from these writers in their own posts, so we advise reading at least a little of both authors to get up to speed on LessWrong's background knowledge and culture.
Truth-seeking norms and culture
We are proud of the LessWrong community not just for its study of rationality, but also for how much these ideals and skills are put into practice. Unlike many social spaces on the modern Internet, LessWrong is a place where changing your mind, charitable interpretation, scholarship, and many other virtues are cherished. LessWrong helps you improve your rationality by providing a space where healthy epistemic and conversational norms are encouraged and enforced.
Social support and reinforcement
Beyond culture and norms, it’s easier to learn, change, and grow when you’re not alone on your path. Find solidarity on your quest for greater rationality with the LessWrong community. You can participate in the conversations online (via the comments or writing posts which build on the posts of others). Or attend a local in-person meetup, conference, or community celebration. In the last twelve months, there have been 461 meetups in 32 countries.
Opportunities to practice your rationality
See the next section.
Applying your rationality to important problems
Feedback and practice are crucial for mastery of skills. If you’re not using your skills to do anything real, how do you even know whether you’re on the right track? For this reason, LessWrong is a place where rationality is both trained and put to use.
Plus, it’s nice to accomplish real things.
Ways to apply your rationality on LessWrong
Participate in discussions aimed at truth-seeking and self-improvement
On LessWrong, you can converse with others with the real goal of exchanging beliefs and converging on the truth. You can delight in dialogue that isn’t about Being Right, but about actually clarifying the matter at hand. And you can work together with others, each of you contributing your own understanding and background knowledge, to figure out how reality really is. This is not Internet discussion as you know it.
While rationality, self-improvement, and AI are the most frequently discussed topics on the site, discussions also commonly cover psychology, philosophy, decision theory, mathematics, computer science, physics, biology, history, sociology, meditation, and many other topics.
Core to LessWrong is that we want our online conversations to be productive, constructive, and oriented around determining what is true. Our Frontpage commenting guidelines ask members to:
Aim to explain, not persuade. Write your true reasons for believing something, not what you think is most likely to persuade others. Try to offer concrete models, make predictions, and note what would change your mind.
Present your own perspective. Make personal statements instead of statements that try to represent a group consensus (“I think X is wrong” vs. “X is generally frowned upon”). Avoid stereotypical arguments that will cause others to round you off to someone else they’ve encountered before. Tell people how you think about a topic, instead of repeating someone else’s arguments (e.g. “But Nick Bostrom says…”).
Get curious. If I disagree with someone, what might they be thinking; what are the moving parts of their beliefs? What model do I think they are running? Ask yourself - what about this topic do I not understand? What evidence could I get, or what evidence do I already have?
Once you’ve read some of LessWrong’s core material and read through some past comment-section discussions to get a sense of how we communicate around here, you’re ready to participate in a LessWrong discussion.
Post your valuable ideas
Our collective knowledge and skills are solidified by members writing posts. By writing posts, you benefit the world by sharing your knowledge and benefit yourself by getting feedback from an audience. Our audience will hold you to high standards of reasoning, yet in a cooperative and encouraging manner.
Posts on practically any topic are welcome. We think it's important that members can “bring their entire selves” to LessWrong and are able to share their thoughts, ideas, and experiences without fearing whether they are “on topic”. Rationality is not restricted to only specific domains in one’s life, and neither should LessWrong be.
However, to maintain its overall focus, LessWrong classifies posts as either Personal blogposts or as Frontpage posts. The latter have more visibility by default on the site.
All posts begin as personal blogposts. Authors can grant permission to LessWrong’s moderation team to give a post Frontpage status if it i) has broad relevance to LessWrong’s members, ii) is timeless, i.e. not tied to current events, and iii) primarily attempts to explain rather than persuade.
The not-perfectly-named category of “Personal” blogposts is suitable for everything that doesn't fit on the Frontpage. It’s the right classification for discussions of niche topics, personal interests, current events, community concerns, potentially divisive topics, and just about anything else you want to write about.
Contribute on LessWrong’s Open Questions research platform
Open Questions was built to help apply the LessWrong community’s rationality and epistemic skills to humanity’s most important problems.