Uhm, you're too happy to drive!

Image recognition is moving towards understanding the mood of that "face". Credit: EPFL/Jamani Caillet

Detection of stress and quantifying it. Credit: EPFL

Image recognition keeps making progress. Face recognition is already a commodity: we find it on mass-market point-and-shoot cameras that recognise faces and detect when a smile appears in order to take the picture.

We have a number of automatic tagging applications that, by looking at certain ratios among facial characteristics, can recognise a person across several photos, even if that person appears in one shot with a beard and in another without, in one with blond hair and in another with black hair.

Researchers from different fields are teaming up to extract more information out of an image. The branch of affective computing is particularly interested in understanding how a person feels during an interaction, and this has led to a good capability of picking up "moods" from the observation of a face (both as it appears in a photo and, even more, as its expression evolves in a clip).

Researchers at EPFL in Lausanne, in collaboration with PSA Peugeot Citroën, have decided to apply to driving safety the possibility of identifying seven basic (and universal) emotions displayed by a face: fear, anger, joy, sadness, disgust, surprise, suspicion.
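The article does not describe the researchers' actual classifier, but the basic step of turning per-emotion scores into a single predicted emotion can be sketched as follows. The scores here are made up for illustration; a real system would compute them from facial features.

```python
# Toy sketch: pick the most likely of the seven basic emotions
# from raw classifier scores via a softmax. The input scores are
# hypothetical, not derived from any real facial-feature model.
import math

EMOTIONS = ["fear", "anger", "joy", "sadness", "disgust", "surprise", "suspicion"]

def classify(scores):
    """Softmax the raw scores and return (emotion, probability)."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(EMOTIONS)), key=lambda i: probs[i])
    return EMOTIONS[best], probs[best]

# Hypothetical raw scores for one frame of video:
emotion, p = classify([0.1, 2.3, 0.4, 0.2, 1.1, 0.3, 0.0])
print(emotion)  # "anger" wins with these made-up scores
```

A video-based system would run something like this on every frame and smooth the result over time, since a single frame can easily be misread.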

Now the question is whether it is unsafe to drive if you are happy, or whether it would be more dangerous to drive when you feel blue...

According to psychologists it is unsafe to drive if you are irritated, whilst being in the pink or feeling blue has no bearing on driving safety. So the problem the research team had to "face" (no pun intended) is recognising irritation by looking at facial expression, and this is indeed a problem because different people show irritation in different ways (there is no universal expression of irritation).

The researchers started by studying two recognisable emotions, anger and disgust, and worked on finding an appropriate, non-intrusive way to capture expressions within a car. They ended up using an infrared camera embedded in the steering wheel. By observing several drivers and merging information derived from both static photos and clips, they have been able to evaluate a person's level of stress and correlate it to the level of irritation.

There is much more that can decrease safety as you drive. Being drowsy is certainly not going to help. They found that the level of drowsiness can be evaluated by looking at the droopiness of your eyelids, the tiny movements of your head and neck, the tone of your voice...

In fact, they feel that only by analysing multiple hints together, and learning the specifics of each driver, can the system produce an accurate prediction of risk.

Of course, once you have that assessment you can inform the driver, but the million-dollar question is: will the driver stop driving because a machine tells him it is unsafe to do so? Unlikely. Would the system go as far as blocking the car? Would we be forced to wear a mask!?

Would regulators eventually impose this safety device on cars, as they did the airbag? And will a black box in the car register your emotions and let your insurance company look into those data? Wow! Privacy!!!

Technology is taking us onto uncharted paths where most of the issues are not technology related.

At the EIT ICT Labs we are starting a High Impact Initiative to improve truck drivers' safety by monitoring their physical condition as they drive. I suspect that if we are (technologically) successful we will open up a can of worms...

Author - Roberto Saracco
