Just a few days ago, in presenting its Apple Watch, Apple pointed out a feature that makes the wristband vibrate in two different ways, giving the wearer an indication to turn right (or left) when the watch is playing the role of a navigation system. No need to look at the map on the (tiny) screen: just feel your wrist and follow the hints.
Now I stumbled onto this news from the University of Cincinnati, where a researcher studied the value of vibration hints to create a 3D visualisation of the space in front of a person. He has developed a sort of torch (though the plan is to shrink it and make it possible to embed it in fabric so that it can be worn) that detects obstacles in front of the person and converts this information into specific vibrations of a wristband.
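The news item does not detail how obstacle distance is encoded into vibration, so here is a minimal sketch under assumed parameters: a `distance_to_vibration` function (a hypothetical name) that maps a measured distance to a vibration intensity, with closer obstacles vibrating more strongly and the maximum sensing range chosen arbitrarily at 4 metres.

```python
# Hypothetical sketch: the sensor reading and the linear mapping below
# are assumptions, not the actual encoding used in the research.

def distance_to_vibration(distance_m: float, max_range_m: float = 4.0) -> float:
    """Map an obstacle distance (metres) to a vibration intensity in [0, 1].

    Closer obstacles produce stronger vibration; anything at or beyond
    max_range_m produces none.
    """
    if distance_m >= max_range_m:
        return 0.0
    # Linear ramp: intensity 1.0 at contact, fading to 0.0 at the
    # edge of the sensing range.
    return 1.0 - (distance_m / max_range_m)
```

With these assumptions, an obstacle 1 metre away yields an intensity of 0.75, while a clear path (4 metres or more) yields 0; a real device would likely use a richer encoding, such as distinct pulse patterns for direction as well as distance.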
This system, he reasoned, would be useful to a person with impaired vision.
He carried out several experiments, both with visually impaired people and with people with normal vision. Interestingly, he found that both groups (the normally sighted were blindfolded during the experiment) had no problem using the information provided through vibration.
This promises to be a real help for people needing to navigate a space, and it may also turn out to be useful for sighted people who have to move through a dark space.
I suspect, and in a way the embedding of vibration-based signalling in the Apple Watch is a confirmation, that we are going to "see" a significant evolution in the application of haptic interfaces in the coming years, leading to new ways of interacting with the environment.