The Survival of Humanity, by Lawrence Rifkin (September 13, 2013). Some excerpts:

An existential catastrophe would obliterate or severely limit the existence of all future humanity.

As defined by Nick Bostrom at Oxford University, an existential catastrophe is one which extinguishes Earth-originating intelligent life or permanently destroys a substantial part of its potential. As such, it must be considered a harm of unfathomable magnitude, far beyond a tragedy affecting those alive at the time. Because such risks jeopardize the entire future of humankind and conscious life, even relatively small probabilities, especially when compounded over a long period of time, may become significant in the extreme. It would follow that if such risks are non-trivial, the importance of existential catastrophes dramatically eclipses most of the social and political issues that commonly ignite our passions and tend to get our blood boiling today. [...]
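To make the compounding point concrete, here is a minimal arithmetic sketch. The 0.1% annual figure is purely illustrative (not taken from the article), and the years are assumed to be independent:

```python
# Illustrative only: cumulative chance of at least one catastrophe,
# assuming a hypothetical, independent 0.1% risk per year.
p_annual = 0.001
years = 1000

p_cumulative = 1 - (1 - p_annual) ** years
print(f"Chance of at least one catastrophe over {years} years: {p_cumulative:.1%}")  # ~63.2%
```

Under these assumptions, a risk that looks negligible in any single year becomes more likely than not over a millennium.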

One would think that if we are mobilized to fight for issues that affect a relatively small number of people, we would have an even stronger moral and emotional motivation to prevent potential catastrophes that could kill or incapacitate the entire human population. But there are significant psychological barriers to overcome. People who would be emotionally crushed just hearing about a tortured child or animal may not register even the slightest emotional response when contemplating the idea that all human life may one day become extinct. As Eliezer Yudkowsky wrote, “The challenge of existential risks to rationality is that, the catastrophes being so huge, people snap into a different mode of thinking.” [...]

Here is a partial list of suggestions worthy of consideration. The idea here is not to advocate for some extreme survivalist or “Chicken Little” mentality, but rather to use reason, foresight, and judgment about how best to protect our future.

  • Create a larger worldwide stockpile of grains and other food reserves.
  • Support and prioritize global measures to detect, prevent, and halt emerging pandemic infectious diseases, such as the WHO’s Global Outbreak Alert and Response Network.
  • Invest in technologies to discover and deflect large asteroids and comets on a deadly collision course with our planet.
  • Consider banning the synthesis and public publication of the genome sequences of deadly microorganisms such as smallpox and the 1918 influenza virus, thereby reducing the risks of bioterrorism or accidental release.
  • Maintain stores in multiple locations of wild plant species, seed banks, and gene banks to safeguard genetic diversity.
  • Invest in space station research. Because the Sun’s steadily increasing brightness will eventually overheat the planet, Earth will become uninhabitable for humans in about 1-1.5 billion years (and uninhabitable for all life several billion years later, when the Sun ultimately expands). This is, understandably, almost too long from now to contemplate. Nonetheless, our best (and possibly only) chance for survival in the very distant future may be to live in space or to colonize other planets or moons.
  • Create strains of agricultural species better able to withstand major environmental change and threats.
  • Continue to strive towards scientific accuracy in predicting climate change effects, and work towards renewable energy sources, sustainable use, technological solutions, and other measures to prevent potential climate catastrophes. Human-caused environmental changes that increase the risk of global pandemics deserve particular attention.
  • Develop appropriate oversight of new molecular manufacturing technologies.
  • Prioritize international cooperation to reduce nuclear proliferation, secure existing nuclear weapons, develop systems to minimize technological mishaps, and decrease the world’s nuclear armamentarium.
  • Maintain a well-chosen small number of people in a deep, well-protected refuge, with adequate supplies to last for years, to buffer against human extinction from a range of causes. Genetically diverse international volunteers who live in such a bunker could be rotated, say, every two months. A similar Noah’s ark refuge could be established on a space station.
  • Work towards changing the social conditions that foster ideological absolutism.
  • Promote evidence-based thinking and reason at all levels of society.
  • Plan in detail to quickly produce and administer vaccines and other medical interventions during a pandemic.
The idea is not that we should do all these, but that the issue deserves our very highest consideration. 

3 comments:

Most of those (8/14) have little to no bearing on actual short- or medium-term x-risks.

[anonymous] replied:

Global food reserves are currently under one year’s worth (according to a UN report). As a result, if some kind of plague wipes out grain yields (or some other major food source) for this year, we would be looking at a massive die-off or a collapse of modern society.

Infectious diseases are a pretty obvious x-risk.

Asteroids are also pretty obvious x-risks.

As is open-sourced bio-terrorism.

Loss of genetic diversity in wild species, granted, isn’t going to cause human extinction in the short or medium term.

For space stations and other sustainable off-world settlements, see the above threats of asteroids and plagues, as well as the concept of not putting all your eggs in one basket. The expansion of the Sun is not a near-term threat, but multiple colonies are a pretty blanket protection against x-threats.

Protecting food production from environmental change and environmental threats: see above regarding our current food stocks and their vulnerability to damage to this industry.

Climate change could cause pandemics through the migration of pathogens to areas whose populations have no prepared immunity. Mapping these movements sounds like an important way of reducing a near-term x-risk.

Nano-tech. I'm not sure how you would define short- or medium-term, but I suspect "within the next few decades" counts. Having legislation in place before nano-robotics takes off seems like an important near-term step to take.

Nuclear weapons. The ability of multiple competing nations to nuke each other back to the stone age, held in check only by public opinion and a mode of thought so insane even the acronym is MAD. Nuclear weapons count as an x-threat (IMHO).

For a protected and isolated colony, see my notes on space colonies above. A large number of threats can be prevented by having a backup population.

Ideological absolutism, a.k.a. the kind of mentality that leads to terrorism or to thinking "everyone who X must die, at any collateral cost."

Evidence-based thinking is another one of those general things that make new x-risks less likely to emerge and more likely to be discovered before it is too late, but I'll agree that the main effects would be other benefits.

Pandemics again.

I got 12-13 with clear relevance, so clearly we disagree about some of them. Which of the above would you not count as an x-threat?

I'm not positive which ones you mean, but if you mean things like famine prevention and climate change, then I think I disagree with you. I think that anything that triggers the collapse of civilization does constitute an existential risk, because I think there is a significant chance that, in a deep and total civilizational collapse (on the scale of the fall of the Roman Empire, say), humanity would never recover.

People in Europe survived the collapse of the Roman Empire because they were able to fall back on older Iron Age technologies, which were good enough to keep a decent percentage of the people alive. But we are much farther from those technologies now: practical knowledge of how to survive without modern technology is much rarer, the environment is much more degraded, the population would have to fall much faster, and in any case a collapse that large and that broad would carry a significant risk of leading to a nuclear war or other catastrophe. I would say that there is at least a 10-20% chance that a total collapse of civilization would lead either to human extinction or to humans never again reaching the current level of technology.