Controlling a robot with your mind...

Research subjects at the University of Minnesota fitted with a specialized noninvasive EEG brain cap were able to move a robotic arm in three dimensions just by imagining moving their own arms. Credit: University of Minnesota College of Science and Engineering

Using implanted electrodes to pick up electrical activity from very narrow regions of the brain simplifies the decoding of the signals. Here is an example of an implanted electrode array that allows a patient to control a robot arm with her thoughts. Credit: UPMC

Considerable progress has been made in decoding the electrical activity generated in our brain when it directs our body to perform a specific action. Researchers have been able to "read the mind", in a way our thoughts, as it gets ready to order our muscles to perform a certain action, like "pick up an object".

This decoding is extremely complex, nothing short of magic. There is so much going on in the brain that isolating a specific electrical pattern associated with a specific "thought" is quite challenging.

The detection of the electrical signals is the first step, and so far this has required implanting electrodes in the specific parts of the brain that control specific muscles. This greatly simplifies the isolation of the relevant signals from the "background" noise.

Now researchers at the University of Minnesota have been able to isolate these signals using non-invasive electrical detection. A head cap with electrodes picks up the electrical activity. A person wearing it thinks of moving her arm and hand to pick up an object and place it on a shelf; the signals are decoded and a robot is commanded to perform the action. Clearly the first part, the decoding, is the real tough one. Controlling a robot to perform a specific action, once you know what the goal is, is a piece of cake.
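To see why the control side is comparatively easy, a decoded intent can be mapped to a robot command with little more than a lookup table. The sketch below is purely illustrative, assuming a hypothetical set of decoded motor-imagery classes and a velocity-controlled robot arm; it is not the Minnesota team's actual interface.

```python
# Hypothetical mapping from decoded motor-imagery classes to 3D velocity
# commands for a robot arm. Class names and the speed parameter are
# illustrative assumptions, not part of the research described above.
INTENT_TO_DIRECTION = {
    "left":  (-1.0, 0.0, 0.0),
    "right": (1.0, 0.0, 0.0),
    "up":    (0.0, 0.0, 1.0),
    "down":  (0.0, 0.0, -1.0),
    "rest":  (0.0, 0.0, 0.0),
}

def command(intent, speed=0.05):
    """Turn a decoded intent into a per-axis velocity command (m/s)."""
    dx, dy, dz = INTENT_TO_DIRECTION[intent]
    return (dx * speed, dy * speed, dz * speed)

print(command("up"))    # velocity command along the vertical axis
print(command("rest"))  # robot holds still
```

Once the decoder emits a reliable class label, a control loop like this runs the arm; all the difficulty lives upstream, in producing that label from noisy EEG.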

The feat, a world first, has been performed using a cap with 64 electrodes. The electrical activity captured by the cap is analysed using advanced signal processing and machine learning. This complements the learning that takes place in the brain of the person using the BCI (Brain-Computer Interface). The brain, seeing the result of its "thinking", changes its activity and learns from the changes in the robot's actions, until it manages to control the robot in the desired way. This adaptation is now mutual, since the decoder adapts through machine learning as well.
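A common building block in such non-invasive decoders is measuring power in the mu band (roughly 8 to 12 Hz), which weakens over the motor cortex when a person imagines moving a limb. The toy sketch below, using only NumPy, shows the idea on synthetic signals; the sampling rate, band limits, and threshold are illustrative assumptions, not the actual 64-electrode pipeline used in the study.

```python
import numpy as np

FS = 250  # assumed EEG sampling rate in Hz (illustrative)

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def decode_intent(signal, fs=FS, threshold=1.0):
    """Crude decoder: suppressed mu-band power -> imagined movement."""
    mu = band_power(signal, fs, 8, 12)
    return "move" if mu < threshold else "rest"

# Synthetic test data: "rest" carries a strong 10 Hz mu rhythm,
# while imagined movement suppresses it (event-related desynchronization).
t = np.arange(FS * 2) / FS  # two seconds of samples
rng = np.random.default_rng(0)
rest = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
move = 0.1 * rng.standard_normal(t.size)

print(decode_intent(rest))  # rest
print(decode_intent(move))  # move
```

Real systems replace the fixed threshold with a classifier trained on each user's data; retraining that classifier as the user practises is what makes the adaptation mutual.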

The next step for the researchers will be to have the brain controlling an exoskeleton attached to a paralysed arm to restore its functionality.

We can expect more and more progress in the coming decades, with the goal of fully restoring functionality in paralysed patients by the middle of this century.

Author - Roberto Saracco

© 2010-2018 EIT Digital IVZW. All rights reserved. Legal notice