Cause Prioritization

Edited by Multicore and Yoav Ravid; last updated 17th Jan 2021

Cause Prioritization is the process of researching which charitable causes offer the most benefit per marginal investment. Priorities can shift as existing causes reach their funding and hiring goals and as new opportunities to do good are discovered. Cause prioritization is an important part of Effective Altruism.

See also: Cause Prioritization Wiki

Posts tagged Cause Prioritization (15 of 59 shown)
- Efficient Charity: Do Unto Others... · Scott Alexander · 15y · 209 karma · 322 comments
- Short Timelines Don't Devalue Long Horizon Research [Ω] · Vladimir_Nesov · 5mo · 170 karma · 24 comments
- Further discussion of CFAR’s focus on AI safety, and the good things folks wanted from “cause neutrality” · AnnaSalamon · 9y · 65 karma · 38 comments
- Why CFAR's Mission? · AnnaSalamon · 10y · 59 karma · 56 comments
- Prioritization Research for Advancing Wisdom and Intelligence · ozziegooen · 4y · 49 karma · 8 comments
- Why I'm Skeptical About Unproven Causes (And You Should Be Too) · Peter Wildeford · 12y · 42 karma · 98 comments
- Cause Awareness as a Factor against Cause Neutrality · Darmani · 7y · 39 karma · 4 comments
- S-risks: Why they are the worst existential risks, and how to prevent them · Kaj_Sotala · 8y · 31 karma · 106 comments
- What are the reasons to *not* consider reducing AI-Xrisk the highest priority cause? [Q] · David Scott Krueger (formerly: capybaralet) · 6y · 29 karma · 27 comments
- Why SENS makes sense · emanuele ascani · 6y · 28 karma · 2 comments
- What's the best ratio for Africans to starve compared to Ukrainians not dying in the war? [Q] · ChristianKl · 4y · 9 karma · 28 comments
- Differential knowledge interconnection · Roman Leventov · 1y · 6 karma · 0 comments
- On Doing the Improbable · Eliezer Yudkowsky · 7y · 131 karma · 36 comments
- Caring less · eukaryote · 8y · 75 karma · 24 comments
- A case for donating to AI risk reduction (including if you work in AI) · tlevin · 9mo · 60 karma · 2 comments