There is a pattern that rationalist-type people seem to me to be prone to:
1. Internalize a belief X widely endorsed by society, one with many implications for action.
2. Notice that people don't actually do the actions implied by X.
3. Do the actions implied by X really hard, including trying to get the many X-endorsers to see the error of their ways and join you. Usually, society is not very receptive to this.
Examples:
Men's rights activism: society says we should treat the genders the same, and when one gender is treated worse, we should do activism to fix it. There are various ways in which men are treated worse that people largely don't care about, so we should do activism to fix them.
James Damore: my company says it wants to recruit more women, but its strategy overlooks psychological differences between men and women. I should write a memo to help it improve its strategy.
Various religious reform groups: religious authorities say we should be pious, but mostly use religion for their own socio-political purposes. We should practice religion by actually following the holy book.
Effective altruism: society says every human has the same moral worth and we should help those in need, but organizations that claim to do this work are mostly doing other stuff, and people donate to them anyway. We should find the ways of helping arbitrary people that do the most good per dollar, so people can donate to those instead.
I think this is error-prone because at step 2 of the journey we found out that barely anyone actually believes X; people just say it for whatever reason. Of course, that doesn't automatically make X false. But when you look at the self-justifications of groups that fit this pattern, they mostly lean on the wide endorsement of the belief and assume it as background.
This won't convince those who are not convinced, and most people who verbally agree with X are already used to living with the contradiction. You also shouldn't lean on the wide endorsement yourself: it is some positive evidence, but given the wide dis-endorsement in practice, it is weak to negative on net.
Instead, I advise that before proceeding to step 3, you independently derive belief X - both for yourself and for anyone you want to convince, which turns out to be almost everyone.