Annual Report on Global Risks

The Foundation's annual report Global Catastrophic Risks 2016 is the result of a continued partnership between the Foundation and researchers at the Future of Humanity Institute and the Global Priorities Project at the Oxford Martin School, University of Oxford. The report, commissioned by the Foundation, draws on the work of 15 researchers who identified and analyzed some of the most critical global catastrophic risks: risks that, in the worst case, could eliminate 10% or more of the global population.

The report focuses on several kinds of risk. Climate change is the risk that has so far received the most attention. The researchers stress the importance of looking not only at the most likely outcomes but also at less probable scenarios that could lead to catastrophic damage, for example global warming of 4, 6, or 8 degrees Celsius rather than the 2 degrees Celsius on which, for instance, the climate change conference in Paris focused.

In addition to climate risks, the report considers nuclear war and pandemics. The researchers also point to another category of catastrophic risk, the so-called emerging risks, such as artificial intelligence, geoengineering and synthetic biology.

Common to all emerging risks is that they arise from technologies primarily intended to benefit humanity, but whose failure or misuse can have devastating consequences.

The annual report does not merely describe the risks; the Oxford researchers also seek to show how these risks influence one another, how they can be reduced, and who should bear primary responsibility for doing so.

Download the full report | Download the Executive Summary