Tony Barrett Gives CSIS Practice Talk on Inadvertent Nuclear War to GCRI Nuclear War Group

28 October 2012

On Thursday 11 October 2012,* GCRI hosted the second of a series of discussions among a group of nuclear war scholars. The discussion centered around a practice talk that Tony Barrett gave for an upcoming conference at the Center for Strategic and International Studies (CSIS).
* We apologize for the delays in getting this post online.

Meeting participants included Martin Hellman of Stanford and Seth Baum, Tony Barrett, and Jacob Haqq-Misra, all of GCRI.

Barrett’s talk, Analyzing and Reducing the Risks of Inadvertent Nuclear War between the United States and Russia, was prepared for the fall 2012 conference of the CSIS Project on Nuclear Issues. The talk was based on research Barrett led for GCRI with Seth Baum and Kelly Hostetler.

Inadvertent nuclear war occurs when one nation incorrectly believes that it is under attack and then uses nuclear weapons in what it believes is a counterattack. The result is a nuclear war that happens “by accident”. While inadvertent nuclear war has never occurred, there have been several close calls, including the 1983 Able Archer incident and the 1995 Norwegian rocket incident.

For the previous GCRI nuclear war group discussion, see GCRI Hosts Discussion Of Nuclear War.

Recent Publications from GCRI

Democratic Participation and Global Catastrophic Risk

The issue of democratic backsliding has gotten extensive attention in recent years. That is the issue of countries becoming less democratic than they previously were. Often, it is discussed in terms of governments becoming more authoritarian. That is undoubtedly very...

Parsing AI Risk in Early 2025

Depending on who you ask, the AI industry is either on track to render humans obsolete within the next few years or about to go bust, or maybe somewhere in between. This matter is among the most important questions right now, not just for global catastrophic risk but...

Advance World Peace to Address Global Catastrophic Risk?

It may seem ridiculous to entertain the idea of world peace at this time. With major active wars in Ukraine, the Middle East, Myanmar, and multiple locations in Africa, plus smaller conflicts and tense relations between many countries and subnational factions, this is...