The First Colloquium on Catastrophic and Existential Risk

Will humans make it through the next century?

It's not a frivolous question. A surprising number of futurists give us a good chance of not making it, as there is no shortage of existential risks. There are human-driven threats, such as the spread of nuclear weapons, bioterrorism, and the misapplication of new technologies (such as nanotechnology and artificial intelligence). The access of extremists to ever-cheaper and ever-more potent weapons represents a more malevolent form of these threats. Then there are natural threats such as a changing climate, overpopulation, and the chance of a large-diameter asteroid impact. Or (perhaps worse), the end could come from something we haven't seriously considered.

These threats are quite real, but they are orphans because of the propensity of world leaders to make decisions on short timeframes. Long-term strategies are fascinating, but they don't win elections. Yet at the existential level it is essential to think ahead, to invest in the distant future, and to spend current resources on risks that are small now but that over the long term might add up to something quite formidable.

What to do?

For centuries, we have turned to our great academic institutions for the difficult answers - answers about disease, energy, transportation, food production, how the universe works, and, more generally, how to improve our quality of life. Most of these advances have been easy to fund because they have produced immediate, dramatic results. But what about breakthroughs that produce little in the short run but could prove critical down the road? What greater mission is there than the sustainability of the species?

The best institutions addressing these matters are not in the United States. The Future of Humanity Institute at Oxford University and the Centre for the Study of Existential Risk (CSER) at Cambridge University are both in the United Kingdom. What we have in the United States are mostly ‘awareness groups’, which seek to identify problems but are not equipped to find technical and engineering solutions.

The B. John Garrick Institute will be a flagship international venue for dealing with these problems. But we need to be creative and do something iconic to position ourselves as the leader in dealing with catastrophic and existential risks. We need to do something that will make us the center of an international community of scholars on such risks. That ‘something’, we believe, is to instigate, organize, and host a high-profile international colloquium at UCLA involving the world's leading scientists and thinkers on existential and catastrophic risk. Our inquiries with world leaders in the field have met with overwhelming support.

To that end, the B. John Garrick Institute will host the first Colloquium on Catastrophic and Existential Risk at UCLA’s Luskin Conference Center on 27-29 March 2017. Some of the most eminent figures in the field have been invited to participate. It is our intent to contribute to the pioneering work of the groups listed above and of United States groups such as the Future of Life Institute, the Machine Intelligence Research Institute, the Global Catastrophic Risk Institute, and the Foresight Institute. Many more institutes and groups, such as the volunteer group The Future Watch, are engaging with the topic of catastrophic and existential risk. The aim is to extend the work of all these groups on prevention, mitigation, and methods for predicting and managing catastrophic risks. Policy and leadership issues for taking action will also be part of the program. One purpose of the colloquium is to develop source material that could evolve into an international roadmap for action to prevent, mitigate, or defer catastrophic and existential threats.

Attendees will include CSER’s Executive Director, Seán Ó hÉigeartaigh. Albert Carnesale, former chancellor of UCLA, provost of Harvard University, and dean of Harvard's Kennedy School of Government, will be attending. Christine Peterson, Co-Founder and Past President of the Foresight Institute, will be another distinguished guest. She serves on the Advisory Board of the Machine Intelligence Research Institute, and has served on California's Blue Ribbon Task Force on Nanotechnology and the Editorial Advisory Board of NASA's Nanotech Briefs. Seth Baum, Executive Director of the Global Catastrophic Risk Institute, will also be attending. This list is by no means exhaustive, and it is growing daily.

The Colloquium will produce comprehensive documentation of the proceedings and video recordings of all presentations. It will feature a series of lectures by distinguished scholars on topics such as the status of current knowledge on global and existential risk, mitigation and prevention, prediction and management, the cost-benefit of different action scenarios, and policies for implementation. The lectures will subsequently be synthesized and the key topics identified.

The progress and outcomes of the Colloquium will be posted and continually updated on the Institute’s website, www.risksciences.ucla.edu.