How to survive a plague – why AI is key to fighting the next major pandemic

A new pandemic could kill 150 million and wreak havoc on the planet; an AI expert proposes a plan of defense


It’s long been held that we’re overdue for a deadly pandemic, yet as we enter 2019, we still have little clue as to where it might come from. Could “the next big one” be a zoonotic disease – like SARS, Ebola or Zika – or a disease synthetically created and engineered by humans to be as lethal as possible?

This was the topic when diplomats from around the world gathered in Geneva for the annual Biological Weapons Convention (BWC), renewing their nations’ commitment to preventing the development, production and stockpiling of biological weapons. Previously, the BWC has been largely concerned with issues like anthrax – but this year, conversation also turned to the possibility of a pandemic.

Regardless of where the next pandemic comes from, one thing almost everyone agrees on is that we’re not nearly as prepared as we need to be. A recent “plausible scenario” drill run by the Johns Hopkins Center for Health Security (CHS) saw a potential pathogen killing as many as 150 million people worldwide while causing economic chaos, famine and rioting.

As Microsoft founder Bill Gates warned recently in a lecture to the Massachusetts Medical Society: “… there’s one area where the world isn’t making much progress, and that’s pandemic preparedness.”

Referring to his talk at the 2017 Munich Security Conference, Gates said:

I asked world leaders to imagine that somewhere in the world a new weapon exists or could emerge that is capable of killing millions of people, bringing economies to a standstill, and casting nations into chaos. If it were a military weapon, the response would be to do everything possible to develop countermeasures. In the case of biologic threats, that sense of urgency is lacking. But the world needs to prepare for pandemics in the same serious way it prepares for war.

In recent years a number of potential pandemics, such as Ebola, have been relatively non-contagious. However, this isn’t always the case. Last year marked a century since the Spanish flu, a disease so virulent and deadly that it killed more than 3 percent of the world’s population. Given the growth in global mobility and human population since 1918, experts believe that if a similar disease struck today, the death toll could exceed a quarter of a billion people.

As in almost all other areas of R&D and healthcare, organizations from governments to pharma companies to hospitals are exploring how technology – namely artificial intelligence – can help with a whole host of healthcare challenges. It’s natural then, in this information age, to turn to AI when we think about preparing for and responding to a pandemic. The potential for AI to help is huge, not only in finding new therapies but in revisiting existing research and data to see what answers might lie there that could be repurposed. However, there are barriers to overcome if we are to consider AI as part of the arsenal in treating a future outbreak.

Using AI to fight pandemics

AI offers the chance to respond effectively to the threat of a pandemic because it can work far faster than any manual research effort. Already, AI platforms can draw on tens of millions of scholarly papers, datasets and journal articles at once, compared with the 200 to 300 papers the average human researcher reads each year.

However, AI is not a magic bullet; it can only ever be as good as the data it’s provided, and much of the data in the pharma industry is not up to scratch. The scale of the problem is thrown into sharp relief by research showing that just 3 percent of companies’ data meet even basic quality standards. Challenges that can arise when trying to analyze large, disparate datasets include:

  • Lack of data
  • Incorrect data
  • Siloed data
  • Formatting issues
  • Language barriers
  • Incompatible tools

All these issues can cause data to be incorrectly interpreted by AI algorithms, which introduces bias into all subsequent calculations, because AI is subject to the “garbage in, garbage out” (GIGO) principle. As Dr. Matthew Clark, Director of Scientific Services for Elsevier’s R&D Solutions, commented recently:

Algorithms can only extrapolate from what is known; even the best algorithm in the world will still yield poor results if it’s only given half the data it needs. For AI/deep learning, the standard of data needs to be much higher than for human researchers, not only in terms of accuracy but in terms of being free of bias.

Because of this, researchers need to ensure they are providing comprehensive, quality data to their AI platform, even though this can be extremely challenging considering pharma companies have researchers working in different countries and languages and using different tools. Often, data is accidentally siloed in several locations when researchers save their work locally rather than on a central server. Yet if these technical problems can be solved, AI may hold the key to solving the biggest challenges involved in responding to a pandemic – cost and time.
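Many of the problems listed above can be caught with simple automated audits before any model sees the data. As a minimal sketch of the idea (the field names, record structure and checks here are hypothetical, not any particular platform’s schema):

```python
def audit_records(records, required_fields):
    """Flag basic quality problems in a merged dataset before an AI model sees it."""
    issues = {"missing": 0, "malformed": 0, "duplicates": 0}
    seen = set()
    for rec in records:
        # Lack of data: a required field is absent or empty
        if any(not rec.get(field) for field in required_fields):
            issues["missing"] += 1
        # Formatting issues: e.g. a dose stored as free text instead of a number
        dose = rec.get("dose_mg")
        if dose is not None and not isinstance(dose, (int, float)):
            issues["malformed"] += 1
        # Siloed data merged twice: the same measurement appears in two copies
        key = (rec.get("compound_id"), rec.get("assay_id"))
        if key in seen:
            issues["duplicates"] += 1
        seen.add(key)
    return issues
```

A real pharma pipeline would add domain-specific checks (valid units, controlled vocabularies, cross-language field names), but even a crude audit like this makes the “3 percent meet basic quality standards” problem visible before it biases a model.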

Providing a rapid response through drug repurposing

Scientists have estimated that the minimum cost of developing vaccines against the 11 diseases most likely to be involved in a pandemic is between $2.8 billion and $3.7 billion – a figure that could prove decidedly optimistic given that combatting SARS in 2003 cost $54 billion. Time, meanwhile, is even more important: when a virulent new sickness appears, finding a cure swiftly has to be the number one priority. AI can help overcome both challenges by radically improving existing techniques, such as drug repurposing; options currently being examined include:

  • Genetically modified organisms (GMO) treatment, where DNA is deconstructed using molecular biology and recombined on an industrial scale to produce a “DNA vaccine.” This approach, which is already being used to help fight hepatitis B, is being explored for a host of other diseases as well.
  • Common condition repurposing, where scientists identify the symptoms of a new potential pandemic and craft a drug in response by using elements of existing drugs that address those conditions, as is currently being trialed for dengue and yellow fever.
  • A universal flu vaccine which aims to boost influenza-specific T-cells, a development that would be particularly beneficial to the elderly, who would be most at risk from flu.

Firms are already attempting to discover new uses for their existing drugs, but the process isn’t as efficient as it could be because repurposing requires researchers to carefully trawl through reams of existing data to spot patterns or potential breakthroughs. By deploying AI to do the same task, firms could drastically improve the efficiency of their repurposing efforts in the event of a pandemic.
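One common computational approach to repurposing is to score existing drugs by how closely their known effect profiles match the profile of a new disease. A toy sketch of that idea, using cosine similarity over entirely hypothetical feature vectors (real systems use rich molecular, genomic and clinical features, not four-element lists):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def rank_repurposing_candidates(disease_profile, drug_profiles):
    """Rank existing drugs by how closely their known effect profile
    matches the feature profile of a new disease (higher = better candidate)."""
    scored = [(name, cosine(disease_profile, vec))
              for name, vec in drug_profiles.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Hypothetical binary features, e.g. which symptoms or pathways each profile touches
drugs = {"drug_a": [1, 0, 1, 0], "drug_b": [0, 1, 0, 0]}
ranking = rank_repurposing_candidates([1, 0, 1, 1], drugs)
```

The value of AI here is not the ranking function itself but the scale: the same comparison run across millions of literature-derived profiles, far beyond what a human researcher could trawl by hand.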

Getting the foundations right

The idea that we’re overdue for the next pandemic is a terrifying one, especially if it proves as deadly as the fictional pathogen in the CHS scenario referenced in the introduction. Yet we cannot simply pretend it isn’t going to happen at some point. Instead, we should be doing everything we can to be prepared when it does strike. The foundations of our response should involve repurposing existing research and using AI platforms to ensure we’re optimizing our chances of finding a cure. For AI to be part of the solution, though, we must overcome the obstacles outlined here.

Given the volume of data in question, trying to tackle the problem of de-siloing and harmonizing data manually is simply not an option. Instead, firms and industry bodies must invest in life sciences platforms capable of intelligently gathering and contextualizing data to the point where it is “AI-ready.” Therefore, organizations involved with public health – pharma companies, universities and governments, to name a few – all have a responsibility to make sure their AI platforms are as effective as possible. Investment in preparing the data today will be the key to fighting off the biggest threats of tomorrow.
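In practice, making siloed data “AI-ready” starts with mapping each silo’s local field names and units onto one canonical schema. A minimal sketch, assuming two hypothetical lab sites with incompatible exports (the mappings and field names are invented for illustration):

```python
# Hypothetical mappings from each silo's local field names to a canonical schema
FIELD_MAP = {
    "site_a": {"cmpd": "compound_id", "dosis_mg": "dose_mg"},  # e.g. a German-language export
    "site_b": {"compound": "compound_id", "dose_g": "dose_g"},
}

def harmonize(record, site):
    """Rename site-specific fields to the canonical schema and normalize units."""
    out = {FIELD_MAP[site].get(key, key): value for key, value in record.items()}
    # Unit normalization: grams to milligrams
    if "dose_g" in out:
        out["dose_mg"] = out.pop("dose_g") * 1000
    return out
```

A production life sciences platform does this with ontologies and curated vocabularies rather than hand-written dictionaries, but the principle is the same: harmonize once, centrally, so every downstream AI model sees one consistent dataset.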


Written by

Jabe Wilson, PhD


Dr. Jabe Wilson, Consulting Director of Text and Data Analytics at Elsevier, was recently named to the DataIQ 100 – a list of the most influential people in data-driven business. Recognized as a “titan” of data-driven business for his work with data and analytics, Jabe is showcased on his own page on DataIQ’s website, where he shares his thoughts about data, predictions for 2018 and his hopes for the industry.
