I think at this point I'm in the learning phase where I'm just staring at things in wonder. For my bugs and technique discussions, I have followups with my workshop buddies and that seems to be working pretty well. I think the reason I came to lesswrong was to understand more about the community itself. Who is part of it? How big is it? What is everybody talking about? Those kinds of things. Reading posts every couple of days seems to be working for now.
It might help if there are recent posts where the community is focused inward and talking about itself. I've seen a couple of these, but if there are any good ones that come to mind for you I would appreciate it.
Thank you for being welcoming.
Thank you for this advice. I was going to ask you for specifics, then I felt dumb, then I just started googling, then I found the list:
Something Orwellian, like The Guild of Truth or Reality Club or something. It would need a name that would inoculate new members against thinking they were joining a religion.
I can no longer tell if I'm trying to be ironic or serious, it's a consequence of living in the post-truth era.
I definitely agree with more rationalists having children. Any baby would be lucky to be born to a loving family that values education and helping the world.
The larger concern, however, is misplaced. If we want more rationalism and altruism in society, it isn't a question of genes; it's a question of memes. For one, we don't have time for the next generation to solve the biggest risks we face. We have to find a way to make vast swaths of humanity more rational and altruistic in, like, the next 50 years. One generation of gene spreading is not going to increase our ranks enough to make a difference. Second, intelligence can help rationality and altruism, but it can just as easily hurt those things. I've known some pretty brilliant people who were completely blind to their biases and treated people with cruelty.
This is similar to the discussions around AI risk, I think. Will a smart AI necessarily be kind or work in the interests of humanity? Maybe not.
The question should be how we nurture our intellectual children. What does rationality become as a mature system of thought? Will it be accessible and valuable to anybody regardless of their current social identity / location / education, etc.? Can it be easily transmitted or does it require context? These are the parental thoughts we should be having, in addition to finding somebody to love and produce babies with.
If only we could trick everybody into believing they are ruled by a powerful empire / God that can't be seen or proven to exist. The empire inspires good behavior in its citizens by enforcing policies in a way that is anonymous (it might just be randomness or other causes). The empire demands no taxes (you're already protected or "saved" just by being human). The empire rewards people who treat others with love and kindness by magically imbuing them with invincibility / eternal life. (They don't actually have these things; it's just a way to feed the egos of and reward good actors in society.) The empire punishes war by ensuring that only the side perceived as worthy will prevail. (This distracts from simply building an army by focusing the creative elements of society on arbitrary actions to appease the empire / God. Some of these may coincidentally have a positive influence on society.)
I'm new here, so I'd like to share my rationalist origin story. (Please somebody tell me if I'm doing this in the wrong place.) I only became aware that rationality was a thing very recently. I'm getting started with the sequences and rationalist blogs, but there is a ton to read and it will take me a while. I'm familiar with many of the concepts and I have strong opinions about them, though I realize there's a lot to learn. I am going to try to express my opinions but hold onto them loosely, so PCK can work.
I was introduced to rationality by attending a CFAR workshop in early May. I'm not sure exactly why I signed up. A few people at work had raved about it, but I didn't really understand what it would help me accomplish. For the last year I've been feeling a lot of anxiety about the future of humanity, the possible collapse of society, etc. I've been coping with this anxiety by writing short stories about a moral revolution. I think one of our root problems is that people mostly talk about how things aren't working. I wanted to write about how things might be working perfectly. If I had a specific goal, it was to get CFAR's help making more progress at becoming an author.
I found the workshop to be transformative in many ways I won't go into here. It helped me with my writing project as well, but not in the way I expected. My writing is concept-heavy, but I am bad at creating characters. One concept that is important to me is that humanity needs a new kind of philosophy. Something that isn't quite a religion or a scientific theory or an economic model but is something that combines all of those domains. This philosophy would strengthen individuals and give groups in different domains shared values/goals. Rationalism strikes me as this kind of philosophy. The rationalists I observed in the workshop struck me as being stronger because of what they know. Rationalism had changed them in ways I don't yet fully understand. In short, you all make me believe that a moral revolution is possible. You help me imagine the kinds of people who will address humanity's biggest challenges.
This is a really long intro. I feel weird about that, but I'm going to post it anyway. Rationalism is great. You're all great.
I've been thinking about this post the last couple of days and I had an idea. I'm not sure if it directly addresses the topic, but if I don't write it down then it'll probably be wasted.
Idea: prior to an event like a CFAR workshop (or a similar event), create teams of 5 people each. Organize them by where they live and what leisure activity they enjoy. Choices are video games, movies, reading, board games, etc. In some rented space with plenty of snacks, the 5 participants are asked to spend 4-8 hours together. They can enjoy the leisure activity if they wish, or they can talk and just relax. The participants will have a chance to activate their parasympathetic nervous systems and authentically relate. The faces of the other participants will seem familiar and safe. This strengthens bonding in the larger cohort when the event begins the next day, or something.
Assumed context: Modern society puts too much focus on productivity. I have a strong bias about this because my career involves operational analysis. There are always top-down forces pushing to reduce staff by x% by increasing everybody's productivity. This is because cost metrics are clear: we can easily measure how many widgets were produced. Value metrics are often unclear: we can't easily measure the strength of a team's internal relationships (i.e., its social complexity). All of us are often trying to 'do more', and so in social settings we are often guarded. Consequently, strong social bonds are less likely to form. This isn't just about places of work; it's part of our philosophy. We are all trapped in the attention economy, trying to create things that might make people notice us. (Just like what I'm doing now, I guess...)