It is profoundly difficult to deal with risks whose consequences may include the global collapse of civilization or even the extinction of humanity. The pandemic has shattered our illusions of security and reminded us that, despite all scientific and technological advances, we remain vulnerable to catastrophes that can alter our entire way of life. These are living possibilities, not mere hypotheses, and our governments will have to face them.
As Britain emerges from Covid-19, it could find itself at the forefront of responding to future disasters. The government's recent integrated review, Britain's G7 presidency and the COP26 climate conference, to be held in Glasgow later this year, are all occasions to address global crises. But to make sure the UK is truly prepared, we must first identify the biggest risks we face in the decades to come.
Technological progress since the Industrial Revolution has ultimately increased the risk of the most extreme events, putting humanity's future at stake through nuclear war or climate collapse. One technology that may pose the greatest threat this century is artificial intelligence (AI) – not the current crop of narrowly intelligent systems, but more mature systems with general intelligence that surpasses our own. Pioneers in AI, from Alan Turing to Stuart Russell, have argued that unless we develop the means to control such systems, or to align them with our values, we will find ourselves at their mercy.
In my opinion, the chances of AI causing existential catastrophe in the next century are roughly one in six – like Russian roulette. If I am correct about the scale of these threats, then this is an unsustainable level of risk. We cannot survive many centuries without transforming our resilience.
The integrated review highlighted the importance of these "catastrophic impact threats", paying attention to four of the most extreme risks: threats from AI, global pandemics, the climate crisis and nuclear annihilation. It correctly pointed out the crucial role that artificial intelligence systems will play in modern warfare, but was silent on the need to ensure that the AI systems we deploy are safely developed and aligned with human values. It underscored the likelihood of a successful biological attack in the next few years, but could have said more about the role science and technology can play in protecting us. And while it mentioned the threat of other countries scaling up and diversifying their nuclear capabilities, the decision to expand the UK's own nuclear arsenal is both disappointing and counterproductive.
To truly transform our resilience to extreme risks, we must go further. First, we must urgently address biosecurity. In addition to the possibility of a new pandemic spreading from animals, there is the even worse prospect of an engineered pandemic, created by states or non-state actors, with a combination of lethality, transmissibility and vaccine resistance beyond any natural pathogen. With the rapid improvements in biotechnology, the number of parties that could create such a weapon is only growing.
To address this risk, the UK should launch a new national biosecurity centre, as recommended by the Joint Committee on the National Security Strategy and my own institute at the University of Oxford. This centre would counter the threat of biological weapons and laboratory leaks, develop effective defences against biological threats, and foster talent and collaboration across the UK biosecurity community. There is a real danger that the legacy of Covid-19 extends no further than preparing for the next naturally occurring pandemic, neglecting the man-made pandemics that keep experts awake at night.
Second, the UK must transform its resilience to the full range of extreme risks we face. We do not know what the next crisis on the scale of Covid-19 will be, so we must be prepared for all such threats. The UK's existing risk management system, within the Cabinet Office's Civil Contingencies Secretariat, is robust in many ways, but it only addresses risks that pose a clear danger in the next two years, making it impossible to properly assess hazards that would take more than two years to prepare for, such as advanced AI. We also lack a chief risk officer, or equivalent position, who can take sole responsibility for the full range of extreme threats across government.
Third, we must put extreme risks on the international agenda. These are global problems that require global solutions. The jurist Guglielmo Verdirame argues that, although the climate emergency and nuclear weapons are covered by at least some international law, there is no global legal regime in force that is commensurate with the severity of other extreme risks, or that has the breadth needed to deal with the changing landscape of such risks. The G7 presidency is the perfect opportunity to remedy this. Rather than settle for a treaty on pandemic preparedness, as the prime minister proposes, the UK could raise its ambitions and lead the call for a new treaty on risks to the future of humanity, with a series of UN Security Council resolutions to place this new framework on the strongest possible legal footing.
There is an understandable tendency, even among the most important people in government, to view extreme risks as too overwhelming to take on. But there are concrete steps the UK can take to transform its resilience to these threats, and there is no better time to do so than now. Covid-19 has given us the opportunity to make decades of progress in a matter of months. We must seize it.