Should humanity have seen this pandemic coming? A 2007 scientific study called coronaviruses in bats a “time bomb.” In a 2015 TED lecture, Bill Gates warned that “we are not ready” for the next pandemic. Last year, an interdisciplinary team developed a scenario for a global pandemic, caused by a previously unknown coronavirus. Their predictions came uncannily close to today’s reality.
More and more people are worried about the future of our climate—and with good reason. But experts on the risks of global catastrophes have long been at least as concerned about pandemics, especially those caused by lab-created pathogens. Some even speak of existential risks: risks that pose a threat to the survival of our species.
Oxford philosopher Toby Ord has just published The Precipice, his long-awaited book on existential risks. As Ord defines them, existential catastrophes involve either human extinction, the permanent collapse of civilization, or an irrevocable dystopia, such as a totalitarian global regime. Ord ascribes a probability of 1/30 (a figure he calculated before the current pandemic) to an engineered pandemic causing existential catastrophe within the next hundred years. This is much higher than the probabilities he assigns to existential catastrophe through nuclear war or runaway climate change (1/1000 in both cases).
Is Ord being overly alarmist? We don’t know. Scenarios of human extinction may sound exaggerated, but the mortality rate of the current coronavirus is relatively low compared to that of, say, Ebola. A virus that combines extreme lethality with high infectivity is well within the realm of biological possibility.
Mother Nature could create such a pathogen—but so could we. It is technically possible to modify pathogens in a lab to make them more easily transmissible and deadlier. Such a virus could be used as a bio-weapon by a state actor, or by an apocalyptic sect bent on world destruction. When the Japanese terrorist group Aum Shinrikyo carried out sarin gas attacks in Tokyo subways, they were hoping to precipitate the end of the world. And they also had plans for biological warfare.
Fortunately, they lacked the technology at the time, but that has since changed. Cheap and simple methods of genetic modification, such as CRISPR, have contributed to the democratization of biotechnology. Now almost anyone can tinker with genes at home.
Even without ill intent, accidents can happen. On several occasions, a pathogen has escaped from a laboratory, sometimes with fatal consequences. A few weeks ago, the Washington Post discussed the possibility that the current coronavirus might have escaped from the Wuhan Institute of Virology, a conjecture based on cables sent from the US State Department in 2018 and other circumstantial evidence. A genetic analysis of SARS-CoV-2 in the journal Nature Medicine showed that the hypothesis of an accidental lab release is unlikely in this case, but that doesn't tell us anything about what might happen in the future.
Fortunately, we are not powerless in the face of such threats. It's no coincidence that countries such as Singapore, Taiwan and South Korea, now widely regarded as beacons in the fight against Covid-19, were first struck by the SARS and MERS epidemics. Since then, they have developed excellent infrastructure, contingency plans, and testing capacities that can be ramped up quickly. Their populations, too, have proven particularly alert and vigilant. Past experience may have allowed countries like South Korea to avoid a drastic lockdown thus far.
The debate between proponents and opponents of a lockdown still rages on in many countries. Should we accept that a large part of the population will be infected, or should we try to contain the virus at all costs? The trade-offs are complicated. However, one advantage of containment deserves more attention: it provides us with an excellent dry run for future pandemics. If we give the virus free rein now, we will miss a useful learning opportunity. Next time around, containment may save hundreds of millions, not just thousands.
Biosafety expert Cassidy Nelson of the Future of Humanity Institute in Oxford recently compiled a list of measures to combat pandemics, ranging from the introduction of genetic sequence tests for pathogens to genetic surveillance in public places.
She also addresses the risk of dual-use research (research with both civilian and military applications), such as the genetic modification of pathogens. For example, Dutch virologist Ron Fouchier successfully tweaked the deadly influenza virus H5N1 in his laboratory to make it transmissible between mammals. Such gain-of-function research is meant to further our knowledge of pathogens, so that we can better combat them, but the Dutch government was extremely concerned about Fouchier's work and contested the publication of his research. After a prolonged legal battle, additional safety restrictions were introduced, preventing the publication of his research without an export permit. In the future, Nelson suggests, we should carry out such cost/benefit assessments in advance of the actual research.
Churchill’s saying “Never waste a good crisis” need not be understood as cynical. The current crisis provides an opportunity to develop contingency plans for future pandemics, as well as for the other catastrophic risks that experts like Ord have identified: nuclear wars, extreme climate change, super-volcanoes and the possibility of unaligned artificial intelligence turning against us.
These scenarios have one thing in common: the chances that they will ever occur may be small, but it is prudent to guard against them, given the high levels of uncertainty and how much is at stake. If we succeed, we increase the chances that human civilization will survive for many centuries—perhaps even for millions or billions of years. But if things go terribly wrong even once, we will be history.