people look into universal moral frameworks like utilitarianism and EA because they lack the self-confidence to take a subjective, personal point of view. They need to support themselves with an "objective" system to feel confident that they are doing the correct thing. They look for external validation.
I don't consider myself a utilitarian anymore, but back when I did, this wasn't a good description of my motivations. It actually felt like the opposite: utilitarianism was the framework that made the most internal sense to me, I had a strong conviction in it, and I would often argue strongly for it when most other people disagreed.
I'm just going to say a few things without thinking much about them.
I believe that a natural, healthy reaction to shoulds is to flinch away (a should signals something going wrong: something you think you need to do but don't actually want to do). A lack of that flinch signals either a strong tendency to take things very literally combined with a strong sense of purpose, or, I don't know, just how long one can keep at it? It's literally painful, so why keep doing it? What's the reason to follow shoulds until you are depressed? Why does one get stuck looking at the world through a stiff binary lens of good and bad? This is only one way to relate to the world. Why keep doing it, if not out of wanting to overwrite your own free will?
If EA were solely striving for peak rational, empirically backed altruism, then it is very odd that it doesn't focus massively on attacking gender inequality as a main priority.
Gender inequality is a very well-studied, very old issue. It is a huge bottleneck and/or main driver of major challenges for humanity, such as global warming, AI safety, nuclear arms proliferation, POVERTY, and falling or stagnating levels of education and scientific progress. (I feel like I'm preaching to the choir with all the research backing these claims, but not everyone has gotten the memo.)
EA is a culture as well. It follows tribal laws. Objectivity is the methodology, not the movement. To not acknowledge that is irrational. The only question is to what degree tribalism influences the altruists.
I didn't upvote or react in any way because I don't understand how gender inequality is related to those issues, unless you mean things such as "if more women were in government, it would surely be better for all of us," which I somewhat agree with. But I also don't think that sentence can be true in the same way GiveWell's cost-effectiveness estimates can be.
EA was never an objectively right way to do charity. That's the point. I will elaborate.
The Gates Foundation is perhaps the most famous EA-oriented charity ever to exist. It is largely independent of the global EA movement.
After several years of doing research, Melinda Gates wrote a book about how she had discovered that most of their projects were directly tied to gender equality. To do their work effectively, they needed to overcome female disempowerment in particular.
I read the book and smiled while half-rolling my eyes, along with much of the EU, where each of her discoveries has been known for at least 50 years. The UN, universities, and the World Bank had already published such findings, but in the USA it was somehow news. (Politics.)
You would think this example would sway at least big chunks of EA worldwide. It did not, afaik. That means EA didn't really look beyond the probability space of EA solutions that were already in line with the local culture. If anyone did, they were not supported by 'the tribe'.
Sic semper erat et sic semper erit. (So it always was, and so it always will be.)
This is not a big critique of EA. All such groups are shaped by their originating culture(s). But it at least shows that leaning on utilitarianism isn't enough to produce a rational charity that's self-improving.
Okay, I understand why EAs are uninterested in this (my best guess is that it has an SJW vibe, while EA sees itself as beyond that, "rational," and focused on measurable things), but maybe I will spend some time reading about it. Where do I start? What kind of mindset would be best for approaching it, given that I'm a hardcore EA who keeps a dashboard with measurable goals like "number of people convinced to become EA" and thinks anything that doesn't contribute to the goals on his dashboard is useless?
Huh. You are kind of proving my point here and you don't even seem to realize it. Alright, I will answer.
Don't guess; do research and question your bias. There is significant hard data on this. The Gates Foundation was focused on stuff like buying mosquito nets (an EA classic) and vaccinations, but somehow several of these efforts just failed. They tried to figure out why.
SJW terminology and "vibes" are stuff intelligent people don't really bother with much where I live. We are not living in propaganda bubbles in the EU as much as in some other places.
Approach this with an empirical mindset, but don't expect engineering journals to report on poverty and education issues. I recommend starting by reading what the World Bank says about gender equality and poverty. I can make a post with extensive sources if you like.
I'm obsessed with planning and organizing my life, and I also tend to overthink and analyze things. Goals are a fundamental piece of organization for me. I try to make a substantial part of my life focused on achieving goals: I work out to keep my body healthy, and I work to earn money and feel secure. But I often feel anxious, and I ask myself if there is any other way of organizing life that avoids the concept of goals altogether. I also think it's useful to imagine living a life without some crucial concept.
It seems quite hard to avoid thinking about goals in general when you define a goal as anything you plan and decide to pursue. It might be possible with some activities, when you just follow your curiosity and don't think about the long-term effects of your actions. Does art need goals? But then it seems that following curiosity just becomes your next goal.
Looks like a pretty good alternative, thanks! But I just realized that goals actually have a property I care about that themes don't have: they really narrow your focus.
There are a bunch of news stories and articles on the internet describing "Elon Musk's rules for productivity." I don't know if Elon Musk really wrote them, but that's not the point. One of the rules usually goes like this:
6) Use common sense
If a company rule doesn't:
- Make sense
- Contribute to progress
- Apply to your specific situation
Avoid following the rule with your eyes closed.
I really don't agree with it. I think rules are usually put in place for some very specific reason that might be hard for us to see, but it is there nevertheless. I'm a software developer, and I think that if I had listened to some of my colleagues telling me about rules like "don't try to optimize it when you are still figuring out what you actually want to do," I would be a much better developer right now. But I usually didn't, and I spent a lot of time figuring out how to optimize things that didn't really need it.
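To make that "don't optimize too early" rule concrete, here is a minimal Python sketch. Everything in it (the function names, the pricing scenario, the cache) is hypothetical and invented for illustration, not from any real project: the "optimized" version adds a cache before the requirements have settled, and that cache quietly breaks correctness as soon as the data changes.

```python
# Hypothetical illustration of premature optimization.

# Simple version: easy to read, easy to change when requirements shift.
def total_price(items):
    """Sum the prices of items. Requirements may still change (discounts, taxes, ...)."""
    return sum(item["price"] for item in items)


# "Optimized" version written before the requirements settled:
# it caches results keyed by the list's identity, which goes stale
# the moment the list is mutated.
_cache = {}

def total_price_optimized(items):
    key = id(items)  # fragile cache key, chosen for "speed"
    if key in _cache:
        return _cache[key]
    total = 0
    for item in items:
        total += item["price"]
    _cache[key] = total  # stale results appear if items is mutated later
    return total


items = [{"price": 10}, {"price": 5}]
print(total_price(items))            # 15
items.append({"price": 2})
print(total_price(items))            # 17: the simple version stays correct
print(total_price_optimized(items))  # 17: first call, result is cached
items.append({"price": 100})
print(total_price_optimized(items))  # still 17: the "optimization" broke correctness
```

The point of the sketch is exactly the colleagues' rule above: while you are still figuring out what the function should do, the plain version costs nothing to change, while the "fast" version has already committed to assumptions (stable data, identity-based keys) that the real requirements may not share.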