A few weeks ago I posted some thoughts on the progress of computers in recognising objects in an image; in particular, I mentioned the image recognition advances made by Google.
Now I have read a paper by University of Rochester researchers reporting on the progress made by computers (applications) in recognising the emotions present in an image, such as a person bearing a grudge, feeling blue, or being in the pink.
The researchers used Convolutional Neural Networks (CNNs) to extract emotions from an image. CNNs were inspired by observing the brain and the connectivity among the various neurons that process an image. Each set of neurons analyses a fragment of the image and feeds its output forward to neurons analysing other fragments. This feed-forward mechanism has proven effective in contextualising the relations among the various parts of an image. This matters because an emotion is visually derived by observing all the elements in an image: the very same expression can mean surprise, astonishment, fear, happiness... and to tell one from the other you need to understand the relations among the various parts of the image.
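To make the idea concrete, here is a minimal sketch of what such a convolutional classifier could look like, written in Python with PyTorch. It assumes 48x48 grayscale face crops and seven emotion classes (as in common facial-expression datasets); the layer sizes and labels are illustrative assumptions, not the architecture the Rochester researchers actually used.

```python
# Minimal sketch of a CNN emotion classifier (illustrative, not the
# researchers' actual network). Assumes 48x48 grayscale face images
# and seven hypothetical emotion classes.
import torch
import torch.nn as nn

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        # Each convolution looks at small fragments of the image;
        # stacking layers lets later neurons relate distant fragments.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 6 * 6, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Single forward pass on a dummy 48x48 grayscale image.
model = EmotionCNN()
scores = model(torch.randn(1, 1, 48, 48))
print(EMOTIONS[scores.argmax(dim=1).item()])
```

The point of the sketch is the feed-forward flow described above: information extracted from local fragments in the early layers is progressively combined, so the final layers can weigh relations among parts of the image before assigning an emotion label.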
The quest for emotion detection has grown in the last few years as more and more communications involve images. Social media, even Twitter now, are rich in visual content. Studies of market sentiment or of voting intentions at political elections have so far leveraged textual messages; with the explosion of images, the interest in decoding their meaning is clear.
And, by the way, as computers get smarter at understanding our emotions, our privacy will be challenged even more!