When it comes to announcing Meetups to people who haven't heard of LessWrong before, there's a need to explain what we are about. I want to invite everyone to share their own one-paragraph explanation of what LessWrong is about from their perspective in the comments.


LessWrong consists of people who like to think deeply about what the world is like, and how we can understand it better; what goals we should have and how we can change ourselves to achieve them; and what goals humanity should have, and how to build an AI that helps humanity achieve them.

LessWrong is a movement that seriously tries to better the world by a significant margin, not shying away from the most unconventional strategies. Most notably, we believe in the prime importance of securing AI Safety, and we subscribe to the values of transhumanism. Knowing that nature is not a fair enemy, we put in great effort to grow as individuals and as a community, hoping to gather enough strength to live up to the task. We do this in various ways: applying epistemic standards at least as rigorous as those of science, thinking hard about recent advances in philosophy and how to put its lessons into practice, while keeping an open mind to the benefits of subjective wisdom such as spirituality and our intuitions.

LessWrong is about refining human rationality. It primarily deals with 1. theoretical truth-seeking, 2. practical methods for achieving more of what you want, and 3. the problem of AI alignment.

Who’s “we”?

The “we” that is “Less Wrong dot com, and the people who post and comment there” and the “we” that is “the people organizing and attending this local meetup group” are surely not the same “we”, nor is there any good reason to expect any one explanation to capture the essences of both of these sets.

I'm happy with any "we" that the author of a comment finds appropriate.

When asking questions with the intent to stimulate discussion, it's useful to choose a level of specificity that allows plenty of different answers rather than narrowing the field of possible answers unnecessarily.
