Can We Predict Drug Efficacy with Artificial Intelligence?

The success of personalized medicine depends on the ability to identify patient sub-populations, which is only possible with accurate diagnostic tests based on biomarkers. Given the vast amount of genomics, proteomics and metabolomics data available, identifying effective biomarkers is a complex task. Large amounts of individual patient omics data are being collected, but bioinformaticians still lack sophisticated tools to extract information from them. Big-data computing approaches such as machine learning, artificial intelligence (AI) and neural networks are still in their infancy in the pharmaceutical sector, but with several AI companies rewriting the code for drug discovery, the implications for healthcare are likely to be far-ranging in the coming years. In particular, newly founded start-ups are offering unprecedented solutions using machine learning and AI.

In a booming biomarkers market, which spans both diagnostics and drug discovery, there are, according to AngelList, 173 companies in the bioinformatics space, only 15 of them in Europe. UK-based BenevolentAI is a leader in the use of AI for the efficient diagnosis of disease and discovery of drugs. The company was founded in 2013 with the idea that the drug discovery process required a complete turnaround. Its mission became to change the healthcare and biotechnology industry, which it addressed by using knowledge networks to reduce the time and cost of drug discovery and to find new molecular targets.

Recently, BenevolentAI’s system was successfully used to identify biomarkers in amyotrophic lateral sclerosis (ALS). The system initially reviewed billions of sentences and paragraphs from millions of scientific research papers and abstracts. It then linked direct relationships between the data and organized them into ‘known facts.’ The company curated these known facts and developed a number of hypotheses against qualified criteria. BenevolentAI’s scientific teams then assessed the validity of these hypotheses and arrived at 20 triaged biomarkers they thought were worth exploring further. From these, the company whittled the list down to the top five compounds, which it tested on ALS patient cells. Clinical trials are planned to start later in the year. With some of the drugs tested by BenevolentAI already in development, the time to market for the hits may be reduced.
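The general idea behind this kind of pipeline — extracting relationships from literature into a knowledge graph and then traversing it to generate hypotheses — can be sketched in a few lines. The sketch below is purely illustrative and is not BenevolentAI’s actual system; all entity names, relation labels and the traversal rule are invented placeholders.

```python
# Toy knowledge-graph hypothesis generation, loosely analogous to the
# literature-mining workflow described above. Entities and relations
# are illustrative placeholders, not real biological facts.
from collections import defaultdict

# "Known facts" as (subject, relation, object) triples, as might be
# extracted from sentences in papers and abstracts.
facts = [
    ("ProteinX", "upregulates", "GeneY"),
    ("GeneY", "associated_with", "ALS"),
    ("ProteinX", "inhibited_by", "CompoundZ"),
    ("ProteinQ", "associated_with", "ALS"),
]

def build_graph(triples):
    """Index triples into an adjacency map: subject -> [(relation, object)]."""
    graph = defaultdict(list)
    for s, r, o in triples:
        graph[s].append((r, o))
    return graph

def hypothesize_targets(graph, disease):
    """Propose indirect targets: entities that upregulate a gene that is
    itself associated with the disease (a two-hop link)."""
    hypotheses = []
    for entity, edges in graph.items():
        for rel, obj in edges:
            if rel == "upregulates":
                for rel2, obj2 in graph.get(obj, []):
                    if rel2 == "associated_with" and obj2 == disease:
                        hypotheses.append((entity, obj, disease))
    return hypotheses

graph = build_graph(facts)
print(hypothesize_targets(graph, "ALS"))  # [('ProteinX', 'GeneY', 'ALS')]
```

In a real system the triples would come from natural-language processing over millions of documents and the hypotheses would be scored and triaged by scientists, as described above, rather than returned by a single hard-coded rule.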

BenevolentAI’s approach is unique in that it uses data from scientific papers and abstracts to uncover connections and patterns, an aspect that pharmaceutical companies may not have exploited yet. It is therefore no surprise that Janssen, a J&J company, entered an exclusive license agreement with BenevolentAI for clinical-stage drug candidates. As Ken Mulvany, Chairman of BenevolentAI, explained, “The data [from scientific articles] might show that a protein up regulates a particular gene which is not directly related, leading researchers to look for drugs in a completely different area.” This is how BenevolentAI is capable of finding novel targets through data mining.

In the United States, Atomwise, founded in 2012, has a slightly different value proposition. The company has developed AtomNet, “the first Deep Learning technology for novel small molecule discovery,” which can predict the bioactivity of small-molecule drugs. Using this proprietary algorithm, Atomwise was able to screen 8.2 million molecules to discover a protein inhibitor for the treatment of multiple sclerosis that met all the requirements of its target product profile: “blood-brain-barrier penetrant, orally available, and highly efficacious in animals.” In another example, the AtomNet algorithm identified a drug with no previously known antiviral properties that blocked Ebola infectivity across strains from different epidemics. With such results, Atomwise was able to partner with large corporations such as Merck and IBM and subsequently raise $6 million in seed funding to improve and develop its algorithm. It is certainly possible to envision what drug discovery could look like in the future with AI as an integral part of the pharmaceutical industry.
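At its core, this kind of virtual screening is a ranking problem: score every molecule in a large library with a predictive model and keep the top candidates for follow-up. The sketch below shows only that outer loop; the scoring function is a stand-in, whereas a system like AtomNet would use a trained deep neural network over structural features, and the library would contain millions of molecules rather than four.

```python
# Illustrative virtual-screening loop: rank a candidate library by a
# predicted-activity score and keep the top hits. The scoring function
# is a placeholder, not a real bioactivity model.
import heapq

def predicted_activity(molecule):
    """Stand-in for a learned bioactivity model (higher = better).
    Here it simply reads a stored toy score."""
    return molecule["score"]

def screen(library, top_k=5):
    """Return the top_k molecules by predicted activity."""
    return heapq.nlargest(top_k, library, key=predicted_activity)

library = [
    {"id": "mol-1", "score": 0.42},
    {"id": "mol-2", "score": 0.91},
    {"id": "mol-3", "score": 0.17},
    {"id": "mol-4", "score": 0.88},
]
top = screen(library, top_k=2)
print([m["id"] for m in top])  # ['mol-2', 'mol-4']
```

The value of the approach lies in the model behind the score: if it predicts bioactivity well, a computational pass over millions of compounds replaces a far more expensive experimental screen, leaving only a short list to test in the lab.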

Currently, thousands of candidate molecules are screened with such new approaches, and this strategy may well revolutionize the way pharmaceutical companies perform screening. First, it will significantly improve time to market. Second, as a result, it will also significantly reduce drug development costs. For example, in 2014, the FDA estimated that just a 10% improvement in the ability to predict drug failures before clinical trials could save $100M in development costs per drug; with this method, predictability is expected to increase significantly. Third, after a promising candidate is identified, AI can be used to improve clinical trials from design to data analysis.

Computer-aided drug design (CADD) has been around since the 1980s, particularly in the pharma sector, where it was integrated into the drug discovery process. Although it has contributed to modern drug discovery, it hasn’t removed the need for biological data, and it certainly hasn’t revolutionized the drug discovery process. Big-data computing approaches such as AI have the potential to do so, but a couple of issues must be solved first: noise in data sets, which slows down model training, and the complexity and variability inherent in biology. Once these hurdles are overcome, sophisticated mining processes are likely to increase the individualization of drugs while lowering the cost of drug development, addressing two of the major problems in healthcare. At this stage, success depends on the wider adoption of these novel drug discovery methods within larger biotechnology and pharmaceutical companies, as well as on the development of novel data mining and big data analysis software.

Jeanne-Françoise Williamson, DPhil, CEO, Williamson Biotech Solutions, and Pablo Lubroth, MPhil Bioscience Enterprise, University of Cambridge.
