Getting Smart About Global Catastrophes

by

16 March 2015

This article discusses the importance of analysis for developing the best ways of reducing global catastrophic risk.

The article begins as follows:

Global catastrophes are events so large that they threaten the entire future of human civilization: global warming, nuclear war, pandemics, and emerging technologies such as artificial intelligence and synthetic biology. If we fail to avoid a global catastrophe, nothing else matters. Therefore, preventing a global catastrophe must be a top priority for society today. We should all try as hard as we can, but it’s not enough to try. To succeed, we must be smart.

We need to be smart because some actions are far more effective than others at confronting global catastrophes. We should not waste our time on inefficient, ineffective little efforts when we could be going big. But figuring out how to go big can be a subtle endeavor. That means we need to invest in, and listen to, practical research that helps us confront global catastrophes more effectively.

Smart action on global catastrophes does two things well. First, it makes large reductions in the risk of global catastrophe. Second, it is easy to do. The larger the risk reduction, and the easier the action is to do, the smarter it is.

The remainder of the article is available on Medium.

Image credit: US Air Force


This blog post was published on 28 July 2020 as part of a website overhaul and backdated to reflect the publication date of the work referenced here.
