Comments

Are astronomical suffering risks (s-risks) considered a subset of existential risks (x-risks) because they "drastically curtail humanity's potential"? Or is this concern not taken into account in this research program?

Very interested in this, especially in how to balance or resolve the trade-off between high inner coordination (people agreeing quickly and completely on actions and/or beliefs) and high "outer" coordination (coordination with reality, i.e. converging quickly and strongly on the right things): in other words, how to avoid echo chambers and groupthink without devolving into bickering and splintering into factions.