Global catastrophic vs existential

[Figure: scope/intensity grid from Bostrom's paper "Existential Risk Prevention as Global Priority"]

Philosopher Nick Bostrom classifies risks according to their scope and intensity. A "global catastrophic risk" is any risk that is at least "global" in scope and is not subjectively "imperceptible" in intensity. Those that are at least "trans-generational" (affecting all future generations) in scope and "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on Earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity (and, presumably, all but the most rudimentary species of non-human lifeforms and/or plant life) entirely or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant.

Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional", scale. He argues that such events deserve special attention on cost-benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole. Posner's events include meteor impacts, runaway global warming, grey goo, bioterrorism, and particle accelerator accidents.

Other classifications

Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate. He thinks the most likely sources of bangs are malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed. The most likely causes of this, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures. For example, if a single mind enhances its powers by merging with a computer, it could dominate human civilization. Bostrom believes that this scenario is most likely, followed by flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or current values. He thinks the most likely cause would be evolution changing moral preference, followed by extraterrestrial invasion.

Some risks, such as that from asteroid impact, with a one-in-a-million chance of causing humanity's extinction in the next century, have had their probabilities predicted with considerable precision (although some scholars claim the actual rate of large impacts could be much higher than originally calculated). Similarly, the frequency of volcanic eruptions of sufficient magnitude to cause catastrophic climate change, like the Toba eruption, which may have nearly caused the extinction of the human race, has been estimated at about 1 in every 50,000 years.
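Recurrence intervals like "1 in every 50,000 years" can be converted into a probability over a fixed window such as a century. A minimal sketch, assuming events follow a Poisson process with a constant annual rate (a modeling assumption, not a claim from the sources above):

```python
import math

def window_probability(mean_interval_years: float, window_years: float = 100.0) -> float:
    """Chance of at least one event in a window, modeling occurrences as a
    Poisson process with the given mean recurrence interval (an assumption)."""
    rate = 1.0 / mean_interval_years           # expected events per year
    return 1.0 - math.exp(-rate * window_years)  # P(at least one event in window)

# Toba-scale eruption: ~1 per 50,000 years -> roughly a 0.2% chance per century
print(f"{window_probability(50_000):.4%}")
```

For rare events the result is close to the naive window/interval ratio (100/50,000 = 0.2%), but the Poisson form stays valid even when the window is a sizable fraction of the recurrence interval.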