Our connection to the world is quite sophisticated.
Our brain integrates the stimuli coming from our senses, including, most importantly, the ones from the proprioceptors that provide motion information.
The brain also "checks" that the integration makes sense, and if it does not, we get "dizzy"! This is the case when we read a book as a passenger in a car. The eyes report images consistent with standing still (no motion), since the page we are looking at is still. At the same time, our proprioceptors report acceleration (mostly side to side, but also changes of speed as the car accelerates or brakes). This information is inconsistent: in the brain's experience you cannot be still and moving at the same time (thanks to evolution, since for millions of years that was indeed impossible). Some people are more sensitive than others, i.e. more prone to motion sickness. Also, the brain can learn and get used to these "unusual" situations.
With Virtual Reality the brain is faced with the same problem, although in the reverse order: our eyes report motion whilst our proprioceptors report stillness. The better Virtual Reality is, the worse the problem.
Researchers at Columbia University have discovered a way to minimise the problem: tricking the brain by providing different perspectives to the left and right eyes.
They discovered that by restricting the field of view of one eye while the images create the impression of motion, the brain becomes less susceptible to dizziness (watch the first clip).
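The gist of this approach can be sketched as a simple rule that narrows the rendered field of view in proportion to the virtual motion being shown. The function name and the parameter values below are purely illustrative, not taken from the Columbia system:

```python
def restricted_fov(base_fov_deg, angular_speed_deg_s,
                   max_restriction_deg=40.0, sensitivity=0.5):
    """Narrow the field of view in proportion to virtual motion speed.

    When the scene is still, the user sees the full field of view;
    during fast virtual turns the view narrows toward a "tunnel",
    reducing the motion signal reaching the peripheral vision.
    All constants here are hypothetical tuning values.
    """
    restriction = min(max_restriction_deg, sensitivity * angular_speed_deg_s)
    return base_fov_deg - restriction

# Standing still: full field of view.
print(restricted_fov(110.0, 0.0))    # 110.0
# Fast virtual turn: the restriction is capped, view narrows to 70 degrees.
print(restricted_fov(110.0, 120.0))  # 70.0
```

In practice such a restriction would be applied smoothly, frame by frame, so the user barely notices the vignetting.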
This approach seems to work; however, it is not a real solution. It would be much better if a VR system could provide a full, coherent set of sensory information, that is, send the brain motion sensations through the proprioceptors. At first glance this would seem impossible, unless you experience the VR in a sort of vehicle whose simulated motion can affect your proprioceptors. This is what is done in professional flight simulators, the ones used to train pilots. These are very complex and very expensive systems that we certainly cannot expect to have at home.
Here comes vMocion, a company that is using the discovery made at the Mayo Clinic on the effects of Galvanic Vestibular Stimulation (GVS). A good portion of our sense of motion, and of our position relative to our surroundings, comes from the sensations transmitted to the brain by the vestibular system in our inner ears. As you move your head, the motion is captured by the vestibular sensors and transmitted to the brain. The brain uses this information to create a stable vision of the world. Try looking at an object in front of you while moving your head from side to side, or tilting it. No matter what you do, the brain keeps the object (virtually) still in your field of vision, even though its position moves on the retina.
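The stabilising mechanism described above is the vestibulo-ocular reflex: the eyes counter-rotate against head motion so the image stays put on the retina. In idealised textbook form it is a one-line relation; the gain of 1.0 below is that idealisation, not a measured value:

```python
def vor_eye_velocity(head_velocity_deg_s, vor_gain=1.0):
    """Vestibulo-ocular reflex (idealised): the eyes rotate in the
    direction opposite to the head, scaled by the reflex gain."""
    return -vor_gain * head_velocity_deg_s

# Head turns right at 30 deg/s; the eyes counter-rotate at 30 deg/s,
# so the overall gaze direction (head + eye) does not change.
head = 30.0
eye = vor_eye_velocity(head)
print(head + eye)  # 0.0 -> stable gaze
```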
At Mayo, scientists discovered that by applying an electrical (galvanic) stimulation to the skin behind the ear you can trick the vestibular sensors into detecting a non-existent motion (watch the second clip). By carefully controlling this stimulation, you can have the vestibular sensors report a motion that is in sync with the images shown by the VR system.
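In principle, keeping GVS in sync with the rendered scene means deriving a small, safety-clipped stimulation current from the virtual motion at each frame. Everything in this sketch, including the gain, the current limit and the function name, is a hypothetical illustration, not the Mayo/vMocion implementation:

```python
def gvs_current_ma(virtual_yaw_accel_deg_s2,
                   gain_ma_per_unit=0.005, max_ma=1.0):
    """Map the virtual angular acceleration shown on screen to a
    GVS current in milliamps, clipped to a conservative limit.

    All constants are illustrative; a real device would calibrate
    the gain per user and enforce strict safety bounds in hardware.
    """
    current = gain_ma_per_unit * virtual_yaw_accel_deg_s2
    return max(-max_ma, min(max_ma, current))

# A gentle virtual turn yields a small current; a violent one is clipped.
print(gvs_current_ma(100.0))   # 0.5
print(gvs_current_ma(1000.0))  # 1.0 (clipped at the safety limit)
```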
This approach should provide a better sensation and should further decrease dizziness when using VR systems. On the downside, whilst the Columbia approach can be embedded in the VR system itself, GVS requires additional equipment.
Samsung has announced a set of headphones, Entrim 4D, that exploits GVS to provide a sense of motion (watch the third clip).
Notice that, although quite sophisticated (and effective), GVS does not represent a complete solution, since we have many more motion receptors spread throughout our bodies (basically every joint sends motion and tension information to the brain). Affecting only the vestibular sensors is not enough. Yet, it is a significant step forward.
It is interesting to see how entangled ICT is with bioengineering when the goal is to communicate with our brain. This is an area being addressed by the Digital Sense Initiative at IEEE.