I generally approve of the apparent move away from the "existential risks" terminology and towards "global catastrophic risks".
I'm not sure I endorse it as a terminological shift, insofar as catastrophic risks and existential risks are different things, and using one label to refer to the other creates a lot of confusion. But I definitely endorse being concerned with catastrophic risks rather than solely with existential risks, and endorse properly labeling the object of my concern.
The idea that "existential risks" is an abbreviation for "global existential risks" just seems awful to me.
From your link, the top 5 threats identified by Skoll are:
Climate change
Water security
Pandemics
Nuclear proliferation
Middle East conflict
Singling out the last one is silly, as it is globally dangerous only as a trigger of nuclear proliferation. Pandemics are nothing new, and water security is at most a limiting factor on further population growth. The consequences of global climate change are still unclear, as this is an area of ongoing research, so any meaningful action is unlikely.
Thus the only truly global risk worth addressing at this time is nuclear proliferation.
Just because pandemics aren't new doesn't mean we shouldn't address them. Especially since we seem to have simply been lucky, with no severe pandemic since the 1918 flu.
Inequality is becoming an existential threat.
Please be more specific: What kinds of inequality are how much of a threat?
I think there are defensible ways you could spin this, like:
"Inequality is likely to lead to more distributional conflicts that could metastasize into existentially risky conflicts" or
"Unchecked rich countries can take more risks with the rest of the world's population (as with e.g. global warming, or other potentially more existentially risky tradeoffs) that they could not if there were more international parity" or most plausibly (of things I can think of off the top of my head)
"The coexistence of intensive economic growth (in new technologies), extensive economic growth (in greater total population and urban concentration of such), and poverty greatly increases the probability of a global pandemic."
But it seems pretty clear from all the other phrases written down that these are just meant to be applause lights, so maybe this is an exercise in excessive charity. (Not that applause lights might not have their place; for instance, a speech meant to convince layfolk that existential risk is important.)
I have attacked this specific applause light, because it seems to me dangerous with regards to some existential risks.
Imagine, for example, that a giant meteor is heading toward Earth... but we have divided all resources evenly among the 7,000,000,000 inhabitants of the planet, and there is no way to finance a defense against this danger: too many people would have to agree to pool their resources, and you can't get enough of them to agree. Game over. With more inequality, this specific existential risk could have been avoided.
Sure, for any decision you can invent a "what if" scenario where that specific decision appears wrong. But dividing all resources equally would probably create more problems than it would solve. Starting with: many people would quickly waste their resources, so a new redistribution would be necessary every week.
(This is not an argument of right-wing politics, or at least I am trying not to make it. I am perfectly OK with equality, as long as it can work well. Unfortunately, just like with superhuman AI constructions, a random solution is probably very bad. We need to think hard to find a good solution. And I wouldn't trust someone with too many applause lights to do this correctly.)
I agree that there are potential upsides and downsides to almost everything. But it does seem facially unlikely that a high degree of equality will ever be achieved without institutions that are very effective at financing public goods (leaving aside the possibility of resources becoming so abundant that wealth becomes meaningless, in which case this particular problem is also solved).
A high degree of equality could also be achieved by a hypothetical institution powerful enough to redistribute everything, yet very ineffective at financing public goods.
When everyone has nothing, that too is equality.
More: Skoll Global Threats Fund | To Safeguard Humanity from Global Threats
More on existential risks: wiki.lesswrong.com/wiki/Existential_risk