GCR News Summary July 2015

4 August 2015

Part of the GCRI series of news summaries

Iran reached an agreement with the P5+1 countries—the five permanent members of the United Nations Security Council plus Germany—to limit Iran’s nuclear program. The deal requires Iran to reduce its stockpile of low-enriched uranium by 98% for 15 years. Iran also agreed to cut the number of centrifuges it currently uses to enrich uranium roughly in half and to enrich uranium to no more than 3.7% U-235 (the isotope that powers both nuclear reactors and nuclear weapons). Nuclear weapons require uranium that is enriched to more than 90% U-235. In addition, Iran will render inoperable its nuclear reactor at Arak, which was capable of producing weapons-grade plutonium, and will ship the plutonium in the reactor’s spent fuel out of the country. In return, the P5+1 countries will lift oil and financial sanctions from Iran. If Iran abides by the agreement for five years, the P5+1 will also begin to lift weapons sanctions. But if Iran is caught violating the agreement, “snapback” provisions would automatically reimpose sanctions.

Max Fisher wrote in Vox that a nuclear war in Europe has become “thinkable… even plausible”. Fisher argued that Russia’s relative conventional weakness—as well as President Vladimir Putin’s need to maintain his domestic popularity while the Russian economy is weak—gives Russia an incentive to threaten military escalation in order to gain leverage over the US and NATO. Fisher quotes F. Stephen Larrabee’s recent warning that “the Russia that the United States faces today is more assertive and more unpredictable—and thus, in many ways, more dangerous—than the Russia that the United States confronted during the latter part of the Cold War”. Political forecasting expert Jay Ulfelder surveyed a convenience sample of 100 international relations and conflict experts. Ulfelder estimated from the responses that the experts collectively see roughly a 2% chance of nuclear war between the US and Russia before 2020. Gen. Joseph F. Dunford, the nominee for Chair of the Joint Chiefs of Staff, told the Senate Armed Services Committee that Russia is the greatest security threat to the US. “If you want to talk about a nation that could pose an existential threat to the United States,” Dunford said, “I’d have to point to Russia.”

Sputnik News reported that Russia’s new missile early warning system is back on schedule, with the first satellite in the space component of the system set to be launched in November. Russia’s new system would reportedly consist of ten satellites in highly elliptical orbits and monitoring stations on the ground. Russia has not had working early-warning satellites since the fall of 2014, when the first new satellite was originally supposed to be launched.

University of California, Berkeley computer scientist Stuart Russell compared the dangers of artificial intelligence (AI) to the unintended consequences of internal combustion engines. Russell called for more research on how to keep AI from executing its programs when doing so runs counter to human interests. The Future of Life Institute (FLI) recently awarded $6 million donated by Tesla and SpaceX founder Elon Musk to study the possible dangers of AI (the Global Catastrophic Risk Institute was awarded some of this money to model artificial superintelligence risk). Georgia Tech roboticist Ronald Arkin added that we cannot rely solely on researchers to protect us from the dangers of AI. “Not all our colleagues are concerned with safety,” Arkin said. “The important thing is you cannot leave this up to the AI researchers. You cannot leave this up to the roboticists. We are an arrogant crew, and we think we know what’s best and the right way to do it, but we need help.”

The FLI also published an open letter signed by AI and robotics researchers—as well as by Elon Musk, physicist Stephen Hawking, Apple co-founder Steve Wozniak, and linguist Noam Chomsky—calling for a ban on autonomous weapons systems. The letter said that weapons systems that “select and engage targets without human intervention” could be deployed in a matter of years. While such weapons would potentially be able to fight wars without risking human lives, the development of autonomous weapons could set off a global arms race. Evan Ackerman wrote in IEEE Spectrum that autonomous weapons may be too easy to build to ban them successfully. Ackerman argued that since autonomous weapons may also be better able to follow international law or rules of engagement than human soldiers, we should work to make them behave ethically rather than banning them outright. But as the FLI letter noted, autonomous weapons may allow dictators and terrorists to wield force in a way they could not before. “Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce,” the letter said. “There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.”

Six new cases of Ebola were reported in Liberia, two months after the outbreak was officially declared over in the country. The source of the new cases was not clear, but tests showed that the virus was genetically very similar to previous cases in Liberia. Preliminary results published in The Lancet suggest that the rVSV-EBOV vaccine may be effective in protecting against Ebola. So far none of the more than 2,000 potentially exposed people who received the vaccine have become infected. Although research is ongoing, the preliminary results are promising enough that researchers are now giving the vaccine to every subject of the study, instead of withholding it from a control group. Researchers also reported in The Journal of Clinical Investigation that an inhalable vaccine was able to protect rhesus macaques against Ebola. The research team plans to conduct a phase 1 trial of the vaccine in humans. Commentary in The Lancet noted that the cost of dealing with outbreaks is generally higher than the cost of preventing them:

Global mitigation of future pandemic risk must focus on the large-scale behaviours that lead to zoonotic spillovers. This approach means engaging with the sectors that drive disease emergence, including industries involved in land-use change, resource extraction, livestock production, travel, and trade, among others.

The authors said that in order to lower the risk of pandemics, we will also have to improve our predictive models of disease emergence and, as far as possible, prepare for pathogens that have yet to emerge.

The Rockefeller-Lancet Commission concluded that our use of the planet’s resources is not sustainable. The Commission’s report said that we are able to flourish now at the expense of natural systems we will need to thrive in the future:

Health effects from changes to the environment including climatic change, ocean acidification, land degradation, water scarcity, overexploitation of fisheries, and biodiversity loss pose serious challenges to the global health gains of the past several decades and are likely to become increasingly dominant during the second half of this century and beyond.

“By unsustainably exploiting nature’s resources,” the report said, “human civilisation has flourished but now risks substantial health effects from the degradation of nature’s life support systems in the future.”

A discussion paper under review at Atmospheric Chemistry and Physics argues that, on the basis of the prehistoric record, ice in Greenland and Antarctica may melt faster than previously projected. With 2°C (3.6°F) global warming above the preindustrial level, sea levels are likely to rise by several meters, possibly before the end of this century. Principal author James Hansen, who for many years headed the NASA Goddard Institute for Space Studies, wrote in a separate essay that, based on the information available, he believes “continued high emissions would result in multi-meter sea level rise this century and lock in continued ice sheet disintegration such that building cities or rebuilding cities on coastlines would become foolish”.

A paper in Geology found that magma flows may provide advance warning of a Yellowstone eruption. Somewhere between months and years before the next eruption, there should be signs that magma is moving into and reheating the crust at the base of the volcano. Yellowstone erupts only rarely—the last lava flow was 70,000 years ago—and there is no sign that it will erupt any time soon. But a major Yellowstone eruption like the one 640,000 years ago would bury a large part of the US in volcanic ash and dramatically alter the global climate.

A British consortium proposed to put a satellite that can monitor solar activity at the L5 point, the gravitationally stable point that trails the Earth in its orbit around the sun by about 150 million km. The Carrington-L5 satellite—it would be named after the 19th-century astronomer who observed the most severe geomagnetic storm on record—would be able to send continuous data on solar activity back to Earth and improve our ability to forecast space weather events that could damage electrical infrastructure.
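The 150 million km figure follows from simple geometry: L5 lies on Earth’s orbit, 60° behind the planet, so the Sun, Earth, and L5 form a roughly equilateral triangle whose side is one astronomical unit. A quick sketch of the arithmetic (treating Earth’s orbit as circular, which it only approximately is):

```python
import math

AU_KM = 149_597_870.7  # mean Earth-Sun distance in km (1 astronomical unit)

# Chord length between two points on a circle of radius r separated by
# angle theta: d = 2 * r * sin(theta / 2). For L5, theta = 60 degrees.
theta = math.radians(60)
earth_to_l5 = 2 * AU_KM * math.sin(theta / 2)

print(round(earth_to_l5 / 1e6))  # ~150 (million km), matching the figure above
```

Because 2·sin(30°) = 1, the Earth–L5 separation works out to exactly one astronomical unit, i.e. the same as the Earth–Sun distance.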

Yuri Milner, the founder of Digital Sky Technologies, pledged $100 million to a 10-year project called “Breakthrough Listen” to search for signs of intelligent life elsewhere in the universe. Dan Werthimer, a scientist with the Search for Extraterrestrial Intelligence (SETI), said that about a third of the money will fund new listening equipment and a third will fund new astronomers and new students. Some of the rest of the money will buy researchers time on the biggest radio telescopes in the world. Roger Pielke, Jr. wrote in The Guardian that if we are going to search seriously for extraterrestrial life, we should think carefully about what will happen if we do find something. “Given the profound implications of a discovery of life beyond Earth,” Pielke wrote, “it is irresponsible to embark upon a search without a parallel effort to help society prepare for success in that effort, or even the implications of continued failures.”

This news summary was put together in collaboration with Anthropocene. Thanks to Tony Barrett, Seth Baum, Kaitlin Butler, and Grant Wilson for help compiling the news.

You can help us compile future news posts by putting any GCR news you see in the comment thread of this blog post, or by sending it via email to Grant Wilson (grant [at] gcrinstitute.org).

Image credit: H. Schweiker/WIYN and NOAO/AURA/NSF

Recent Publications from GCRI

Climate Change, Uncertainty, and Global Catastrophic Risk

Download PDF Preprint Is climate change a global catastrophic risk? Warming temperatures are already causing a variety of harms around the world, some quite severe, and they project to worsen as temperatures increase. However, despite the massive body of research on...

On the Intrinsic Value of Diversity

Download PDF Preprint Diversity is an important ethical concept. It’s also relevant to global catastrophic risk in at least two ways: the risk of catastrophic biodiversity loss and the need for diversity among people working on global catastrophic risk. It’s...
