2018 Annual Report

12 December 2018

GCRI may now be at a turning point. We have established ourselves as leaders in the field of global catastrophic risk, with an important and distinctive role centered on the development of intelligent and practical solutions to reduce the risks. We have built up a solid track record of accomplishments in basic research and in outreach that translates research ideas into action. However, we are not accomplishing nearly as much as we could, because we currently have the resources to operate only at a small scale. Therefore, we are now focused on fundraising and scaling up the organization in order to realize our potential for addressing global catastrophic risk.

Accomplishments

In 2018, a major focus of our work has been outreach to important decision-makers. We are increasingly of the view that our impact on the risks is limited less by our understanding of the risks and more by our ability to translate that understanding into action. This holds both for GCRI and for the wider global catastrophic risk community. (A similar observation has been made by others here.) Fortunately, GCRI has a high degree of institutional flexibility, which enables us to shift between research and outreach as opportunities arise.

Our largest 2018 outreach activity was on the national security dimensions of artificial intelligence. AI is just starting to be treated as an important national security issue, which means there are opportunities right now to “get in on the ground floor” and influence the long-term direction of AI policy. To that end, we participated in numerous discussions with senior US congressional staff and other policy insiders on matters pertinent to AI and national security. We also helped facilitate the participation of the broader AI and GCR community in these discussions. This work has had a tangible impact on the AI national security policy conversation, in particular by elevating concern for harmful unintended consequences of AI. It has also positioned GCRI and our colleagues for ongoing high-level contributions.

Our 2018 research included a mix of risk and decision analysis for evaluating the risks and solutions & strategy work for reducing them. We published papers on the probability and impacts of nuclear war, which provide what we believe to be the most detailed nuclear war risk models ever published in the public domain. We published the paper “Long-term trajectories of human civilization”, which breaks major ground on the question of whether extinction risks should be prioritized over the risk of global catastrophes that leave some survivors. We published a paper on asteroid risk in order to make a wider point about uncertainty in global catastrophic risk and its implications for decision-making. Finally, we published papers on skepticism and misinformation about artificial superintelligence, which open up a major new line of research and outreach for improving long-term AI outcomes.

This emphasis on risk and decision analysis has been a longstanding theme for GCRI. We are respected participants in the professional risk and decision analysis fields and pioneers of the application of risk and decision analysis to global catastrophic risk. Risk and decision analysis are essential for evaluating how to allocate scarce resources like philanthropic funding and policymaker attention. They are also essential for evaluating tradeoffs in the many situations in which an action can decrease one risk and increase another. At this time, GCRI is drawing on our experience to develop best practices for the use of risk and decision analysis for global catastrophic risk. Details are discussed here.

In addition to outreach and research, our two other main areas of activity have been mentoring and organization development. Our organization development work is discussed elsewhere in this blog post. Our mentoring work has traditionally involved long-term relationships spanning several years, often resulting in co-authored publications (such as this, this, and this) and reference letters for jobs and graduate school. In 2018, I also did some shorter-term career advising for the organization 80,000 Hours. GCRI recently began an overhaul of our outside affiliates program that is improving our mentoring operation. As senior leaders in the field of global catastrophic risk, we are able to mentor people at all career points, and we believe it is important for us to do so.

Plans

Broadly speaking, GCRI is currently focused on two types of plans: our plans for scaling up the organization and our plans for direct project work. Scaling up the organization is essential for us to have a larger impact on global catastrophic risk. As we scale up, we will do more and better projects that address global catastrophic risk.

There is a strong case for GCRI scaling up. First and foremost, we play an important and distinctive role in the global catastrophic risk community: we advance an intelligent and practical response to global catastrophic risk by working at the intersection of deep academic scholarship and real-world decision-making. To that end, we bring unique expertise in risk and decision analysis, social science, and the technical details of the full range of risks, in addition to our extensive professional networks. We also have a unique institutional structure (an independent, nonpartisan think tank based in the United States with no central office) that enables us to play this role well. Finally, we have an efficient and cost-effective operation, backed by a parent organization, Social & Environmental Entrepreneurs, which provides comprehensive back-end administrative support for a very low 6.5% overhead rate. Without in any way diminishing the work being done by other organizations, there is clearly an important role for GCRI, and it is a role we could play even better if we scale up.

Our planned direct project work includes supporting the global catastrophic risk talent pool, assisting funders, and conducting research and outreach. As noted above, we plan for outreach to be an increasingly large focus of our work, and we likewise plan to align our research more closely with our outreach. We believe this will maximize our impact on the risks.

GCRI has developed specific research and outreach plans across the seven global catastrophic risk topics we work on:

Aftermath of global catastrophe: We plan to study how well any survivors of a global catastrophe would fare, in order to evaluate the relative importance of different global catastrophic risks and to evaluate opportunities to aid survivors.

Artificial intelligence: We plan to use social science research and outreach to develop practical solutions for reducing AI risks (for example in the context of international security), to use risk and decision analysis to guide decisions about AI risk, and to apply select social science and philosophy insights to the study of AI ethics.

Cross-risk evaluation & prioritization: We plan to evaluate important decisions that cut across multiple global catastrophic risks, such as how to allocate scarce philanthropic funding and other resources, and how to deploy AI, nanotechnology, and other technologies that can decrease some risks and increase others.

Nanotechnology: We plan to assess whether long-term nanotechnology risk warrants more attention than the near-zero amount it currently receives.

Nuclear war: We plan to conduct outreach that uses our existing nuclear war risk and decision analysis to inform major policy and philanthropy decisions.

Risk & decision analysis: We plan to continue analyses of specific risks, especially AI, nanotechnology, and nuclear war, and to develop guidelines for the use of risk and decision analysis for global catastrophic risk.

Solutions & strategy: We plan to develop solutions and strategy for specific risks, especially AI, nanotechnology, and nuclear war, and to promote practical approaches to global catastrophic risk reduction within the broader global catastrophic risk community.

These plans cover more work than we will likely be able to do in 2019. The particular mix we focus on will depend largely on what we receive funding to do. Recently, we have gotten funding mainly for work on AI, and this will likely continue to be the case in 2019. We see an abundance of good opportunities to work on AI and are happy to do so. However, all of the above topics are important and worthy of attention and funding.

Funding

GCRI’s productivity and ability to scale up are currently limited mainly by the availability of funding. Indeed, we have accomplished quite a lot despite having very limited funding. As one outside analysis put it, GCRI’s budget has been “shockingly low considering their productivity”.

In order to scale up, we must raise considerably more funding than we have in the past; we have never raised more than around $200,000 in a year. We are now seeking to raise $1.5 million. This amount would fund our leadership team for three years plus a modest discretionary budget that we would use to expand further. We could probably put more than this amount to excellent use, and it would be reasonable to fund GCRI at a higher level. Regardless, we hope to partner with funders who can make additional funds available as GCRI grows and new opportunities arise.

There are several reasons why GCRI has not previously attracted larger amounts of funding. One major reason is a chicken-and-egg problem: organizations often need to be large in order to attract the large amounts of funding needed to operate at scale. Indeed, as documented here, small global catastrophic risk organizations tend to have a hard time getting significant funding even when they are doing excellent work and funding is available. Another reason is that global catastrophic risk funding is almost always designated for specific risks, whereas GCRI specializes in working across risks. We have gotten more funding in recent years by emphasizing work on specific risks, especially artificial intelligence and nuclear weapons, but we still hope that more funding can be made available for cross-risk work.

Traditionally, there have also been relatively few funding options for global catastrophic risk. For example, both the National Science Foundation and the more mission-oriented US government funding programs generally favor work on risks that they believe are higher-probability and less speculative. However, there has recently been significant growth in global catastrophic risk funding, so we are more optimistic about GCRI’s fundraising prospects now than we have been in previous years.

Concluding Remarks

To reduce global catastrophic risk, it is essential to understand the risks and the opportunities to reduce them, and to translate this understanding into real-world action. Doing this well is not easy, because the risks are complex and the people and institutions that affect the risks are often not particularly motivated to reduce them. To be effective, global catastrophic risk projects must pull all of this together to craft solutions that significantly reduce the risks and can actually be implemented. This is precisely GCRI’s specialty.

GCRI has built up deep expertise and capacity for global catastrophic risk project work. We now seek to scale up so that we can do more to address global catastrophic risk. Meanwhile, there is a growing amount of funding available for global catastrophic risk, though much of it does not go to small organizations like GCRI. Therefore, we believe the timing is excellent for GCRI to scale up, and we hope to be able to do so.

To support GCRI, please visit our donate page.
