Using the brain as a keyboard for a computer

System diagram of the advanced SSVEP-based BCI speller. It consists of four main procedures: visual stimulation, EEG recording, real-time data processing, and feedback presentation. The 5 × 8 stimulation matrix includes the 26 letters of the English alphabet, 10 numbers, and 4 symbols (space, comma, period, and backspace). Credit: Xiaogang Chen et al./PNAS

Advances in Brain-Computer Interfaces (BCIs) are visible in niches such as assistive devices for completely paralysed people who can only move their eyes. For them, a BCI captures the electrical signals generated when the person looks at a character on a screen; a computer then decodes those signals to identify the character.

Character after character, words and then sentences are formed, which the computer can display, print, or speak aloud with a text-to-speech program.

The problem is the time it takes a computer to recognise the electrical signal generated by the brain. This signal is mixed with thousands of others produced by the parallel activity of other neurones, which makes detecting the letter a person is looking at a lengthy process.

Here comes news of a solution found by researchers in China, at Tsinghua University, in cooperation with a team at the State Key Laboratory of Integrated Optoelectronics, Chinese Academy of Sciences.

They exploited the fact that a letter blinking on a screen generates waves in the brain at the same frequency, or at a multiple of it. This led them to create a matrix of letters on the screen where each letter "blinks" at a specific frequency, in the 3.5 to 75 Hz range. When the person looks at a specific letter, her brain generates waves at that frequency (or a harmonic of it), which is much easier to detect. Indeed, they report experiments showing a reading/decoding speed of about 60 characters per minute, one letter per second. This is the fastest communication achieved so far by a BCI. It is actually close to the speed at which our eyes move from one character to another, lingering just long enough to single each one out from its neighbours.
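The detection principle can be sketched in a few lines of code. This is a hypothetical, simplified illustration, not the researchers' actual method (real SSVEP spellers typically use more robust techniques such as canonical correlation analysis): each character flickers at its own frequency, and we pick the character whose frequency, together with its second harmonic, carries the most power in the recorded EEG spectrum. The sampling rate, trial duration, and frequency-to-character mapping below are all illustrative assumptions.

```python
import numpy as np

FS = 250                      # EEG sampling rate in Hz (assumed)
DURATION = 1.0                # seconds of EEG per character (~1 char/s)
# Illustrative mapping of flicker frequencies (Hz) to characters:
FREQ_TO_CHAR = {8.0: "A", 9.0: "B", 10.0: "C", 11.0: "D", 12.0: "E"}

def detect_character(eeg: np.ndarray) -> str:
    """Return the character whose flicker frequency dominates the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)

    def band_power(f: float) -> float:
        # Sum power in a narrow band around f and its second harmonic,
        # since SSVEP responses also appear at multiples of the stimulus frequency.
        power = 0.0
        for target in (f, 2 * f):
            mask = np.abs(freqs - target) < 0.25
            power += spectrum[mask].sum()
        return power

    return FREQ_TO_CHAR[max(FREQ_TO_CHAR, key=band_power)]

# Simulate one second of noisy EEG while the user looks at the 10 Hz character:
rng = np.random.default_rng(0)
t = np.arange(int(FS * DURATION)) / FS
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_character(eeg))  # the 10 Hz component dominates, so this prints "C"
```

The simulation stands in for a real EEG recording: the 10 Hz sine is the brain's response to the flickering letter, and the added noise plays the role of the thousands of unrelated neural signals mentioned above.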

It might still seem pretty slow, but everything is relative. I am pretty sure that even small improvements are most appreciated by people with this disability, and it is good to see that technology can help, even a little bit.

Of course, a much better solution would be to read the thoughts of the person directly, at the speed of thought, and only those the person wants to share... but we are still quite far from that.

Author - Roberto Saracco

