Progress in signal processing, applied to both image processing and voice analysis, has reached a point where it is possible to detect emotions (or the lack thereof). This possibility opens the door to a variety of (good) applications, but it also raises some concerns.
As an example, I was captivated by ongoing research at Politecnico di Milano, one of our EIT Digital partners, where Mirko Gelsomini, a researcher in Franca Garzotto's team, has, together with some colleagues, developed a system to help analysts detect social disabilities early in toddlers and young children, and to provide support to improve the children's behaviour.
The system uses teddy bears. I was shown a soft, stuffed elephant by Giochi Preziosi (see photo) with an embedded camera smartly hidden so as not to be noticed by the child, communicating with the analyst's computer. The elephant has a set of motors to animate its trunk, ears and eyes, and a loudspeaker to trumpet. Mirko developed the controller (based on a Raspberry Pi) and the applications providing the interface to the analyst.
The analyst may be in a different room (the elephant is wirelessly connected), monitoring the child's interactions with the elephant and directing the elephant's "actions" while analysing the child's "reactions".
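At its core, the analyst-to-elephant link is a command channel: the analyst's application sends an action, the Raspberry Pi decodes it and drives the motors or the loudspeaker. The sketch below is purely illustrative; the action names and the JSON message format are my own assumptions, not the actual protocol of Mirko's controller.

```python
import json

# Hypothetical set of actions the analyst can trigger on the elephant.
ACTIONS = {"wave_trunk", "flap_ears", "blink_eyes", "trumpet"}

def encode_command(action: str, duration_ms: int = 500) -> bytes:
    """Encode an analyst command as a JSON payload to send over the
    wireless link to the Raspberry Pi controller (format assumed)."""
    if action not in ACTIONS:
        raise ValueError(f"unknown action: {action}")
    return json.dumps({"action": action, "duration_ms": duration_ms}).encode()

def decode_command(payload: bytes) -> dict:
    """Decode a received command on the controller side, before
    driving the corresponding motor or the loudspeaker."""
    msg = json.loads(payload.decode())
    if msg["action"] not in ACTIONS:
        raise ValueError(f"unknown action: {msg['action']}")
    return msg
```

In a real deployment the payload would travel over a Wi-Fi or Bluetooth socket; here the encode/decode pair simply shows the round trip.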
At MIT another team has developed a wearable app, which can also run on a smartphone (see photo), that listens to stories being told (another person talking to you, or what you yourself are saying). By analysing the tone of voice in real time, it detects telltale signs of the emotions behind the talking.
Their goal is to help the owner better understand their own emotions, rather than the emotions of the person they are talking to, thereby helping people who have difficulty detecting their own emotional state, such as people with Asperger's syndrome or other forms of autism. In some cases, however, it may also help a person with these relational disabilities to see the emotions of the people they are interacting with.
All this is good, and it is clear that benefits can be derived from these technologies. On the other hand, the "Big Brother" shadow looms ever larger. With these technologies it becomes possible to look inside people, into their emotions. I can imagine advertisers drooling at the possibility of detecting the emotions raised by an ad and customising it in real time to generate the desired response... It is no longer science fiction.

A latest-generation television with an embedded video camera could capture the expressions and tones of voice of the people in the living room, and an app could detect their emotions and direct changes in the ads being shown, locally, inside the television, so that different televisions in different homes with different audiences would show the same ad rendered in different ways to maximise its impact.
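Stripped to its essentials, the local customisation step is just a lookup from a detected audience emotion to an ad rendering, executed inside the television set rather than at the broadcaster. A minimal sketch, with entirely hypothetical emotion labels and ad variants:

```python
# Hypothetical mapping from a detected audience emotion to the ad
# rendering shown locally on this particular television.
AD_VARIANTS = {
    "bored": "fast-cut version, upbeat soundtrack",
    "amused": "longer narrative version, same tone",
    "annoyed": "shorter spot, muted branding",
}

def pick_variant(detected_emotion: str, default: str = "standard cut") -> str:
    """Return the rendering to show for the detected emotion, falling
    back to the broadcaster's standard cut when detection is
    inconclusive or the emotion is not in the mapping."""
    return AD_VARIANTS.get(detected_emotion, default)
```

The unsettling part is not the lookup itself, which is trivial, but the emotion detection feeding it: the same camera-plus-classifier pipeline described above, pointed at your living room.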