International Colloquium on Catastrophic and Existential Risk
On March 27-29, 2017, the B. John Garrick Institute for the Risk Sciences sponsored an international colloquium on catastrophic global risks that could threaten human existence, that is, existential risks.
Leading scholars and investigators of existential risk participated, including representatives from two prominent centers on catastrophic and existential risk: the Future of Humanity Institute, University of Oxford, and the Centre for the Study of Existential Risk, University of Cambridge, both in the United Kingdom.
Embedded in this challenge is the question of how best to plan for recovery and cope when all else fails. There was clear evidence that research on existential risks has been very limited, with even fewer attempts to quantify their likelihood of occurrence. While quantitative risk assessment is reasonably mature for risks such as nuclear plant accidents, and its benefits there have been dramatic, existential risk assessment models and applications remain very much in the academic and study stage. It is clear that much more could be done to quantify existential risks even with current methods, but the case for doing so has not yet been made. Nevertheless, there is a need for additional specialized methods and technology to analyze global and existential threats.
An outcome of the colloquium was increased confidence that the greatest existential threats are not necessarily naturally occurring events such as asteroid impacts, space weather, super earthquakes and tsunamis, and super storms, but anthropogenic events, that is, human-initiated events. The existential risks identified as being of greatest concern are technological in nature. While biological and nuclear terrorism could have catastrophic consequences, they are not viewed as the greatest anthropogenic existential threats. The greatest existential risks were categorized as synthetic biology, nanotechnology weaponry, machine superintelligence, and those not yet identified. A key consideration is how to control these anthropogenic threats without constraining the constructive advancement of science and technology for the benefit of society.