The future is now: The transformative power of AI

pharmafile | February 5, 2018 | Feature

With the technologies reaching exciting levels of maturity and adoption, Matt Fellows investigates how AI and machine learning can be transformative for some of pharma’s age-old problems.

The pharmaceutical industry is not exactly known for its willingness or promptness in adopting new technologies, particularly the potentially disruptive kind. As a result, it’s somewhat surprising to see such a sizeable wave of praise and support for artificial intelligence on the scale that we are currently seeing, alongside the emergence of a wide range of technology and platform providers to fill this burgeoning appetite.

The development of AI has many figures within the life sciences excited, presenting new opportunities on a level that simply has not been possible in the past. The technology has flexed its computational muscle in a number of impressive feats over recent years, not least of which was AlphaGo’s victory over world champion Lee Sedol in a five-game match of Go in 2016. AlphaGo is the brainchild of DeepMind, a UK-based artificial intelligence firm which was acquired by Google in 2014.

The company later developed a more general version of the same program, AlphaZero, repurposing it to prove its mettle in a range of other board games such as chess and shogi. AlphaZero beat the strongest existing programs at each respective game, famously defeating Stockfish, arguably the world’s best chess engine. Using reinforcement learning, whereby the program plays against itself in order to learn the game, AlphaZero was able to claim victory after teaching itself for less than four hours.

The distinguishing factor between this form of AI and previous incarnations is that AlphaZero is able to employ machine learning to decipher how to complete directives with no human input besides the basic rules of the game. 
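The self-play idea can be sketched in miniature. The following is a toy example, assuming a simplified game of Nim (five stones, take one or two, taking the last stone wins) and a tabular Monte Carlo update rather than AlphaZero’s deep neural networks; all names and parameters are illustrative:

```python
import random

# Simplified Nim: 5 stones, take 1 or 2, taking the last stone wins.
# A single value table is learned purely by self-play: no strategy is
# given beyond the rules, echoing (in miniature) AlphaZero's approach.

random.seed(0)
Q = {(s, a): 0.0 for s in range(1, 6) for a in (1, 2) if a <= s}

def best_action(s):
    """Greedy move from a state with s stones remaining."""
    return max((a for a in (1, 2) if a <= s), key=lambda a: Q[(s, a)])

for episode in range(5000):
    s, history = 5, []
    while s > 0:
        # Mostly play greedily, but explore a random move 20% of the time.
        legal = [a for a in (1, 2) if a <= s]
        a = random.choice(legal) if random.random() < 0.2 else best_action(s)
        history.append((s, a))
        s -= a
    # The player who made the last move wins; alternate the sign of the
    # outcome when stepping back through the plies (zero-sum game).
    reward = 1.0
    for (s, a) in reversed(history):
        Q[(s, a)] += 0.1 * (reward - Q[(s, a)])
        reward = -reward

opening = best_action(5)
```

With optimal play, leaving the opponent a multiple of three stones wins, so with enough episodes the table converges on taking two stones from five, the game-theoretic optimum.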

Remarkable achievements like these really illuminate the progress that the technology has made and what has been made possible as a result. Of course, those in the pharma and life sciences industries are keen to harness this power to apply it to traditional problems in their own space.

Within life sciences, a recent survey from not-for-profit organisation The Pistoia Alliance found that 44% of the industry professionals queried were already using or experimenting with AI, illustrating the breadth of interest in the technology within the industry today. It also revealed where the technology is most commonly applied: a plurality of 46% of AI projects were taking place within early-phase research, while 15% were within the development and clinical stage and 8% within imaging analysis.

In line with these findings, the predominant space where drug companies are applying AI is the drug discovery stage. The technology can digest vast amounts of data and make calculations at a level far exceeding human capabilities to predict how new molecules will behave and streamline the often trial-and-error approach to identifying new medicines.

Why AI?

Pharma-AI collaborations are becoming increasingly common and are beginning to make the headlines; one of the frontrunners in securing these partnerships is Exscientia, a UK-based AI firm which signed multi-million pound deals with industry giants GlaxoSmithKline and Sanofi in 2017.

Pharmafocus spoke to GSK’s Darren Green, Director of Molecular Design, to find out exactly what makes the technology so attractive to one of the world’s leading pharma companies.

“Efficiency is a big drive,” Green said. “You can see how you can do things faster, and in turn cheaper. The other side of it, which is perhaps more specific to the life sciences, is that modern science generates a lot more data, where we miniaturise tests and can make them high-throughput or we can generate data from diseased patient cells where, in the past, you couldn’t.”

He continued: “Patients now generate so much more data which is directly applicable to what we do. We need to be able to deal with that, and that’s exactly where machine learning and AI fit.”

AI brings the promise of faster development and shorter timelines, creating, ideally, quicker routes to market – a win-win for drug makers and patients alike. With the value of data and real-world evidence on the rise, the technology provides a much more efficient way of harnessing that value, and the potential cost savings are the perennially sought icing on the cake.

It’s no surprise that excitement and enthusiasm for the use of AI within the industry seem to be hitting their peak, but the application of the technology in this space is nothing new, and the same is true of sectors outside the life sciences. Derivatives of AI as we know it now have been in use by pharma firms for years, so how does a technology offering like Exscientia’s differ from what has come before? Pharmafocus spoke to the company’s Chief Operating Officer, Mark Swindells, to find out more:

“Machine learning, neural networks and data have existed for a long, long time, but putting these things together in an effective manner has not been done before. If it had, companies like GSK wouldn’t be collaborating with us,” he explained. “You can have an engine and a set of rails, but nobody bothers inventing a train. I don’t doubt that the individual components pre-existed, but what changes things is a better understanding of how computational approaches can be applied to data to give meaningful results. It requires development in all areas: more powerful computing, better data and using the best algorithms for the challenge at hand.”

What’s in a name?

As is often the case in discussions within this space, the twin terms ‘artificial intelligence’ and ‘machine learning’ quickly emerge, and the two are often confused and conflated when it comes to the application of the technology. The term ‘artificial intelligence’ often appears something of a misnomer; in this context, the label ‘machine learning’ is perhaps more fitting. But what is the difference between the two? Green gave his thoughts on the matter:

“I’m not sure anyone is using what you’d call ‘artificial intelligence’ in drug research, in the sense that they are ‘self-learning’ machines,” Green muses. “I think we are mostly using very advanced machine learning algorithms. One day, maybe you’ll get to the point where they are self-learning. There’s a big difference between what we can do and what we can’t do in the life sciences space.

“Machine learning is directed – teaching a model to predict a certain thing – and you input the data. We’ve been using what you’d call machine learning for quite some time, but the sorts of algorithms that people are coming up with now are just better.” 

Swindells used his expertise in the area to offer his interpretation: “For AI to work, you need three key components,” he expounded. “One of them is algorithmic approach, for which machine learning is just one of a huge number of examples of ways that you can approach a problem. Deep learning, which has been popularised by the DeepMind Go challenge, is a development on machine learning.

“Then you need data to learn from, and you also need a computing capacity to deal with that,” he continued. “You really need all three to actually have AI working. So, if anything, machine learning is just a subset of AI. What we’ve found is that AI, when encoded really well by people with a good understanding, as we believe we can, can start to do certain things better than a human can, without a doubt.

“However, you still need the expert human to set the strategy and assess the results as they come out because, of course, the computer isn’t infallible – in certain situations it may see something it hasn’t really seen before. At that particular point, the human adds their input. It’s very different to something like a Go game where you can get the computer to play itself and it can play one million games and learn from those mistakes.”

Both Swindells and Green highlight the crucial distinction in the use of the term ‘artificial intelligence’. While the impressive maturation of the technology, and our perhaps sensationalist understanding of it as it stands today, may lead us to believe that the workings behind DeepMind’s AI and the toolset provided by Exscientia are the same, there are significant differences. One is built to be almost entirely autonomous, able to teach itself; the other remains very much reliant on human input in how it processes data. Though the labels applied to them may vary, this distinction is critical to understanding how the technology operates in a drug development environment.

Saving valuable time

As Swindells notes, the advancement of computing power has enabled a wealth of opportunities for drug discovery, presenting efficient operation on a scale that far eclipses human capabilities. As Green mentioned, one of the biggest benefits brought by AI is speed: “We’ve been looking a lot at how we can improve the speed in which we do things. I work in discovery research, which is about 12 years from market. We’d love to speed up what we do so that we get medicines to patients faster,” he said. “What appealed to us with Exscientia was that, although a lot of the ideas in this space have been around for probably a decade, they were the first company that we saw that had put it all together, made it work, and had real demonstrations that they could take a couple of years off our discovery time. That’s what we’re hoping to see from our collaboration with them. We are looking to dramatically change the speed at which we can do that discovery step.”

Exscientia certainly believes it can deliver on this goal; the company’s Chief Executive Andrew Hopkins has claimed that use of the technology can cut drug development time by up to 75%. Swindells explained how the technology makes this possible:

“It takes about 2,500 compounds to get to a candidate, so we thought that there must be a more efficient approach. So we set ourselves this target of trying to do all of our projects in 500 compounds or less, which will deliver enormous productivity and time enhancements. When we are actually getting the AI system to design the compound to be made, the algorithm is asking really key questions about how to not only design novel chemical matter and consider target potency, but at the same time balance against selectivity targets, antitargets, ADME, synthetic stability and so on, to get the best properties into the molecule. They are then synthesised and assayed, with the data being rapidly fed back into the AI algorithm so that it can then learn what to propose next. It’s a bit like driving down the road, and you can see something in the distance on the horizon, and the closer that you get, you’re getting better vision on what will become the candidate. The algorithms are learning much faster than a human can. Computers make more balanced decisions at each design step that encode more information than a human typically would do. What the humans are doing on the side is shepherding this process and monitoring it.

“It takes four or five years typically to deliver a candidate; if you can be very smart about your design process and you can achieve your candidate molecule in fewer compounds, then you can also do it quicker, so there’s an opportunity there,” Swindells continued. “If you can deliver your candidate, say, in one and a half years rather than five, then that is achieving a three and a half year advantage in the overall development time – we think that’s one of the really big opportunities for the industry.

“Things should get to clinic faster, and then by definition will be available for testing in human patients faster than they would normally be, and that’s a huge gain. People often read the newspapers and hear about a new technology that could be transformational but will take a decade; if we’re taking three and a half years off the discovery element, that really could be transformational in terms of getting these ideas through to a clinical setting, and that’s what we can really offer.”

Exscientia’s technology, and AI in general, brings a new methodology that disrupts the approach traditionally prescribed in pharma, and perhaps this is just what the industry needs. Green provided a little more information on exactly how the platform integrates with GSK’s processes: “Discovery is essentially a classic design, make and test cycle, familiar to lots of industries. We go round that cycle many times in drug discovery – you learn something, you design a better molecule, and you go round and round. The use of the machine learning platform actually puts a quite rigorous framework around that design, make, test cycle. We design a set of molecules, we make them, and we wait until we have all the data back before we go round again. At the moment, we don’t have a lot of discipline around that cycle – we tend to act on interesting data as and when it comes. The Exscientia argument is: don’t do that; wait until you’ve got all the information, then feed it into the machine learning platform, and then go round again – that should be more efficient. That’s going to be interesting for us to see – it imparts a slightly different way of working on us. But a lot of the other framework – how we test the molecules and everything else – that hasn’t changed.”
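That disciplined design, make, test loop can be sketched in a few lines. This is a toy, assuming a single numerical descriptor per compound and a simulated assay standing in for real synthesis and testing; all names and values are illustrative:

```python
import random

random.seed(1)

def assay(x):
    """Stand-in for a wet-lab assay: the true (hidden) potency of a
    compound with descriptor x. A real campaign synthesises and tests here."""
    return -(x - 0.7) ** 2  # unknown optimum at x = 0.7

# Initial screening set: a handful of assayed starting points.
data = [(x, assay(x)) for x in (0.1, 0.5, 0.9)]

for cycle in range(5):
    # DESIGN: propose a batch around the current best compound,
    # plus a couple of random exploratory designs.
    best_x, _ = max(data, key=lambda p: p[1])
    batch = [min(1.0, max(0.0, best_x + random.gauss(0, 0.1))) for _ in range(8)]
    batch += [random.random() for _ in range(2)]
    # MAKE + TEST: wait until the *whole* batch has been assayed...
    results = [(x, assay(x)) for x in batch]
    # ...then feed everything back before the next design round.
    data.extend(results)

best_x, best_y = max(data, key=lambda p: p[1])
```

The key discipline Green describes is in the structure of the loop: no design decision is made mid-batch; the full set of results is returned before the next round of proposals.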

Time is money

As previously mentioned, hand in hand with the time savings and boosts to efficiency come cuts to costs as well. To this end, AI and machine learning technology has found one of its biggest champions in Professor Sir John Bell, one of the UK’s foremost scientific figures and author of the recent Life Sciences: Industrial Strategy, a plan for the industry in the UK to navigate the choppy waters of Brexit in the coming years. Professor Bell has extolled the range of benefits that AI can bring to a number of organisations, most notably the UK’s National Health Service, going as far as to say the cost savings it would generate – estimated by Bell at £1.1 billion, or 50% of the service’s pathology spend – “may be the thing that saves the NHS”.

Indeed, at the beginning of this year, a team of researchers at Oxford’s John Radcliffe Hospital unveiled Ultromics, an AI system which they argue can save NHS hospitals billions in operational costs. The technology is designed for the early detection of heart disease, an area which costs the NHS £600 million a year due to inaccuracies in current monitoring methods, which result in 12,000 misdiagnoses out of a total of 60,000 every year.

“Making a diagnosis from echo relies on experienced clinicians having to make qualitative judgements based on only a fraction of the data that is potentially available to them from a typical scan. Our technology extracts more than 80,000 data points from a single echocardiogram image to overcome subjectivity and increase diagnostic accuracy,” the Ultromics website states.

We may soon see whether Bell’s hypothesis comes true – Ultromics is due to be rolled out in NHS hospitals for free as early as the summer of 2018. It is thought that it could save the service up to £300 million a year.

Reducing nasty surprises

Of course, as both Green and Swindells are keen to point out, these cost savings are not solely of interest to national health organisations – the financial benefits the technology can potentially bring to pharma cannot be overstated.

Alongside savings to costs and time, platforms such as Exscientia’s also provide an efficient way to process another of today’s most talked about tools: big data. Green explains how GSK is leveraging Exscientia’s technology to harness this and reap even more rewards at the drug development stage.

“When you’re developing a medicine, you want it to interact with the pathway or target that you’re interested in for the disease – you don’t want it to be hitting other things,” he remarks. “Exscientia build on public domain data and data provided by commercial content providers, and they’ve built a whole battery of predictive models for all these off-targets, and that’s used in the design. You can imagine it’s very difficult for a medicinal chemist in the lab who’s trying to design the next molecule to look across 3,000 proteins and data sets and think: ‘Will this molecule I just designed interact with all of those things?’

“The way to deal with that is to build predictive models, and then to apply those models. In Exscientia’s case, as the machine is generating candidate structures, they can score it against these off-targets. One of the things we want to learn in our collaboration with them is: ‘Does that actually translate to things that don’t have these nasty surprises? We thought it was very specific, but when we put it into our proteomics studies its hitting these targets we didn’t expect.’”
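The idea of scoring each generated structure against a battery of off-target models can be sketched as follows. This is a toy, assuming simple threshold rules in place of trained per-protein predictors; hERG and CYP3A4 are common off-target examples chosen for illustration, not details of the GSK collaboration:

```python
# Each "model" maps a compound's computed properties to a predicted
# probability of hitting one (hypothetical) off-target. Real systems
# train such models per protein on public and commercial assay data.

def make_model(feature, threshold):
    """Toy predictor: flags the off-target if a property exceeds a cutoff."""
    return lambda compound: 1.0 if compound.get(feature, 0) > threshold else 0.0

OFF_TARGET_MODELS = {
    "hERG": make_model("logP", 4.0),           # highly lipophilic compounds often hit hERG
    "CYP3A4": make_model("aromatic_rings", 3),
}

def off_target_profile(compound, models=OFF_TARGET_MODELS):
    """Score one candidate against the whole battery and flag predicted hits."""
    scores = {name: model(compound) for name, model in models.items()}
    flags = [name for name, s in scores.items() if s >= 0.5]
    return scores, flags

# A candidate proposed by the design engine, described by its properties:
scores, flags = off_target_profile({"logP": 5.2, "aromatic_rings": 2})
```

In Green’s description, this scoring runs inside the design loop, so candidates predicted to hit off-targets are penalised before anything is synthesised.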

Bigger potential, bigger challenges

It’s a somewhat strange thought given how long the technology has been around, but AI is still in its infancy, with much more potential still to be realised. As has been explored, there is clear value in AI and machine learning in the short term, but what about the more distant future?

“I think it only gets bigger,” said Green. “I can see us investing more. I think we’ll still have a very strong laboratory base operation. Drug discovery is a very humbling business – you fail for things you just can’t predict. What we’re hoping is that we’ll lock out the things we could have predicted.”

Swindells added: “I think that, provided you can design a very specific problem that should be answerable, with all the elements such as the data, computation and the algorithm, then AI is going to be applicable in a broad range of areas. What people often discount though is the importance of the domain expertise in deciphering a problem. For us, we all worked in a computational background, but we also married that with a deep knowledge of drug discovery. Our Chief Chemist Andy Bell was a co-inventor of Viagra and also played a key role in getting the life-saving antifungal drug voriconazole to market; it’s how to balance the depth of expertise on the medicine and chemistry side, with a deep knowledge of the problem itself.

“I have no doubt that in other sectors of pharma and life sciences, people who really understand their specific sector will be able to generate real progress,” he continued. “What I’m more cautious about is people who just think AI will solve any problem if you throw enough computers at it. Even if you look back at the DeepMind Go paper, you will find that almost all of the first four authors were top-ranking amateur Go players – they didn’t just come in having read a magazine about a game called Go that they’d never heard of. Imbuing the technology with the experience is key so that it can actually be designed so that it does then actually answer real-life questions.”

Just as Swindells says, the road to realising the widespread benefits of AI and machine learning will of course not always be smooth, and The Pistoia Alliance’s survey identified a number of challenges including problems with access to data and the quality of said data. However, the survey found that perhaps the most significant barrier to greater adoption and efficient use of AI is the availability of technical expertise, with 30% of respondents recognising it as the single greatest hurdle to overcome, corroborating Swindells’ point.

Green agrees, theorising that data literacy and expertise may need to be instilled at an early stage to ensure the relevant individuals are properly equipped to leverage the technology effectively: “There’s talk of data science becoming increasingly important; people coming out with life sciences qualifications and backgrounds, even if they’re mostly laboratory-based, will have to be much more data literate and be capable of understanding how to use the machine learning and AI to best effect, and how to couple that with human thinking.”

It is also worth noting that a sizeable 11% of those asked are not currently utilising AI and 30% are not using machine learning in any capacity, with 8% going so far as to admit they knew “next to nothing” about them. This illustrates quite clearly that, despite a promising level of uptake, there is still a significant obstacle to overcome in cultivating the skills and knowledge needed to unlock the potential the technology promises.

“AI has the potential to revolutionise life sciences and healthcare, all the way from early preclinical drug discovery to selecting precision treatments for individual patients,” commented Dr Steve Arlington, President of The Pistoia Alliance. “Our survey data shows that while life science professionals are already exploring how AI, machine learning and neuro-linguistic programming can be used, there are clear gaps in the knowledge, data, and skills which will enable more pharma and biotech companies to achieve tangible results from AI. Impediments to success, such as a lack of industry-wide standards for data format, will need to be addressed if the potential of AI and machine learning is to be realised. We urge those in the pharmaceutical, biotechnology and technology industries to explore ways in which they can collaborate now, to find answers to common problems of the future.”
