Scientists at InSilico Medicine, Inc. (the name says it all) are using Deep Neural Networks to understand the potential effects of drugs on people. Normally this would require very long testing phases: first in vitro experiments, then animal studies, then trials on people. The process takes at least ten years, often more.
The problem these scientists face is genuinely hard: there are a great many parameters to take into account. Progress in genetics opens up new understanding, but it also increases the complexity, since it is now possible to study drug effects at the molecular level.
One has to look into the genome, the transcriptome (the way genes are expressed) and the proteome (the set of proteins involved), and this creates an extremely complex space (not just complicated, because it has millions of parameters, but complex, because of the mutual interactions among those parameters).
This is where Deep Neural Networks (DNNs) come in handy: they manage the complexity and extract meaning from it.
The scientists at InSilico Medicine have trained a DNN on 678 drugs and have derived some 800 strong hypotheses about effects in fields like oncology, the central nervous system, cardiology and metabolic pathologies.
The hypotheses were checked through the usual experimental methods and showed 54.6% accuracy in prediction across 12 therapeutic classes (this is very good indeed: keep in mind that nothing is certain in biology, and randomly guessing among 12 classes would score only around 8%).
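To make the comparison with random guessing concrete, here is a minimal sketch of the kind of task involved: assigning drugs to one of 12 therapeutic classes from expression-profile features. Everything here is hypothetical and synthetic (the feature count, class signatures and noise levels are invented for illustration), and a simple softmax classifier stands in for the much deeper network the researchers actually used; the point is only that a learned model beats the ~8.3% random baseline on such a task.

```python
import numpy as np

# Hypothetical illustration: classify drugs into therapeutic classes from
# gene-expression-style features. All data is synthetic; the real study
# used measured transcriptional responses for 678 drugs and 12 classes.
rng = np.random.default_rng(0)
n_drugs, n_features, n_classes = 678, 50, 12

# Each synthetic class gets its own mean "signature"; drugs are noisy samples.
class_means = rng.normal(0.0, 1.0, (n_classes, n_features))
labels = rng.integers(0, n_classes, n_drugs)
X = class_means[labels] + rng.normal(0.0, 2.0, (n_drugs, n_features))

# Minimal softmax (multinomial logistic) classifier trained by gradient
# descent -- a shallow stand-in for the deep network described above.
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
onehot = np.eye(n_classes)[labels]
for _ in range(300):
    logits = X @ W + b
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / n_drugs                 # cross-entropy gradient
    W -= 0.5 * (X.T @ grad)
    b -= 0.5 * grad.sum(axis=0)

accuracy = ((X @ W + b).argmax(axis=1) == labels).mean()
baseline = 1.0 / n_classes                        # ~8.3% for 12 classes
print(f"training accuracy: {accuracy:.1%}")
print(f"random-guess baseline: {baseline:.1%}")
```

On this toy data the learned classifier scores far above the one-in-twelve baseline, which is the right yardstick for the 54.6% figure reported by the study.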
What is also very important is that this demonstration of the capability of DNNs to point researchers (and pharma companies) in the right direction is not just slashing the time needed to create a new drug (clinical trials will not be superseded any time soon); it is also opening the way to a future where drugs will be customised to a person's genome. This may start to happen by the end of the next decade, as more and more people have their genome sequenced.
In general, I see this news as another proof of the shift from atoms to bits. We use sensors to transpose atoms into bits and then operate on those bits more effectively, at lower cost and higher speed (in this case leveraging the fact that bits can be duplicated, and each copy, exactly the same as the original, can undergo scrutiny and processing in parallel with all the others), and the outcome can then be translated back into action on atoms.
Another brick in the construction of the Data Economy house.