We have remarkably accurate tools to detect cancer cells in a biological sample. The problem is that they are quite expensive and require skilled doctors to perform the analyses. This makes them largely out of reach in poor countries, both for cost and for lack of trained personnel.
This is where the work carried out at Harvard comes in handy.
Researchers are exploiting smartphone processing and display capabilities to provide practitioners in poor countries with a cheap tool to diagnose a wide variety of cancers.
These practitioners are trained to retrieve biological samples from patients (e.g. exfoliating cells from the cervix) and place them into a container with a solution of beads that selectively bind to cancer cells. The container is inserted into a purpose-built add-on that fits onto a normal smartphone. The add-on contains a chip generating light that is beamed through the sample. Cancer cells that have bound to the beads create a detectable diffraction pattern. More than that: the diffraction pattern provides information on the number of cells bound to the beads, hence an estimate of the number of cancerous cells.
The processing of the diffraction pattern is quite complex, but it can be carried out in the cloud. The application on the smartphone automatically transmits the image of the diffraction pattern taken by the smartphone camera. A normal smartphone camera has sufficient resolution to capture the diffraction pattern in enough detail to single out up to 100,000 cells.
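The article does not describe the cloud-side analysis itself, which is far more sophisticated than anything sketched here. Purely as a toy illustration of the general idea (counting distinct bright features in an image), here is a minimal, hypothetical spot counter; the function name, threshold, and thresholding approach are all assumptions for illustration, not the actual algorithm:

```python
from collections import deque

def count_spots(image, threshold=200):
    """Count connected bright regions ("spots") in a grayscale image.

    Toy stand-in for the cloud-side diffraction analysis: each
    connected cluster of pixels at or above `threshold` counts as
    one spot. The real processing is far more sophisticated.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    spots = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                spots += 1
                # Flood-fill the whole spot so it is counted only once.
                queue = deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return spots

# Two bright "spots" on a dark 4x5 background.
sample = [
    [0,   0,   255, 0, 0],
    [0,   0,   255, 0, 0],
    [0,   0,   0,   0, 0],
    [255, 255, 0,   0, 0],
]
print(count_spots(sample))  # → 2
```

In practice the analysis has to infer cell counts from the diffraction pattern itself rather than from resolved spots, which is why the heavy computation is offloaded to the cloud instead of running on the phone.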
The result of the analysis performed in the cloud is sent back to the smartphone and displayed on its screen for easy interpretation by the practitioner.
What is really interesting is the interplay of different technologies resulting in accurate detection at very low cost: $1.50 per analysis (most of which is the cost of the reagent needed to bind the beads to the cancer cells), with the expectation of further price decreases in the coming years ($1.50 is still a lot of money in many countries).