Existential risk

An existential risk is a risk that is both global (affects all of humanity) and terminal (destroys or irreversibly cripples the target). Nick Bostrom defines an existential risk as a risk "where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential."

If an existential risk destroys a civilization, the entire course of history is changed. If a civilization is flourishing and spreading wellbeing throughout the universe, there is tremendous utility in keeping it alive. If a civilization is causing suffering in all the sentient beings it contacts, then arguably the opposite is true. Which of these better characterizes humanity is an open question. Even more important to utilitarians is which will better characterize the humans of the future; after all, if there is a catastrophe, it is those future humans who will not be born.

Should we prevent existential risks?
This depends on whether humans and animals have, on balance, a positive future. One concern with preventing existential risks is that posthumans might go on to multiply wild-animal suffering.

Another reason to be concerned about existential risk is the Astronomical Waste argument: an enormous number of good future lives (or experiences) would never come to exist.

Taxonomy
Taken verbatim from Nick Bostrom's "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards" (2002):

 * Bangs – Earth-originating intelligent life goes extinct in relatively sudden disaster resulting from either an accident or a deliberate act of destruction.
 * Crunches – The potential of humankind to develop into posthumanity is permanently thwarted although human life continues in some form.
 * Shrieks – Some form of posthumanity is attained but it is an extremely narrow band of what is possible and desirable.
 * Whimpers – A posthuman civilization arises but evolves in a direction that leads gradually but irrevocably to either the complete disappearance of the things we value or to a state where those things are realized to only a minuscule degree of what could have been achieved.

Bangs

 * Deliberate misuse of nanotechnology
 * Nuclear holocaust
 * We’re living in a simulation and it gets shut down
 * Badly programmed superintelligence
 * Genetically engineered biological agent
 * Accidental misuse of nanotechnology (“gray goo”)
 * Something unforeseen
 * Physics disasters
 * Naturally occurring disease
 * Asteroid or comet impact
 * Runaway global warming

Crunches

 * Resource depletion or ecological destruction
 * Misguided world government or another static social equilibrium stops technological progress
 * “Dysgenic” pressures
 * Technological arrest
 * Something unforeseen

Shrieks

 * Take-over by a transcending upload
 * Flawed superintelligence
 * Repressive totalitarian global regime
 * Something unforeseen

Whimpers

 * Our potential or even our core values are eroded by evolutionary development
 * Killed by an extraterrestrial civilization
 * Something unforeseen