Robot Arm for Paralysed People

To control the robot arm, the Brain-Computer Interface (BCI) developed at Fraunhofer FIRST is combined with an eye tracker, which first determines the direction in which the arm should move. For this purpose, the direction of the patient’s gaze is monitored by two cameras mounted on a specially designed pair of glasses.

The actual movement of the robot arm is triggered by a signal from the Brain-Computer Interface. Researchers at Fraunhofer FIRST and Berlin’s Charité have been working for some seven years now on the development of the Brain-Computer Interface using a conventional electroencephalogram (EEG) of the sort employed in routine clinical procedures. Electrodes attached to the patient’s scalp measure the brain’s electrical signals. These are then amplified and transmitted to a computer.
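The acquisition step described above can be sketched in a few lines: scalp potentials are on the order of microvolts, so they must be amplified and digitised before a computer can analyse them. This is only an illustrative sketch; the gain, ADC range, and resolution below are assumptions, not the figures used in the Brain2Robot system.

```python
# Illustrative sketch of EEG acquisition: amplify a microvolt-scale scalp
# potential and quantise it with an analogue-to-digital converter (ADC).
# GAIN, ADC_FULL_SCALE and ADC_LEVELS are hypothetical values.

GAIN = 10_000          # amplifier gain (unitless, assumed)
ADC_FULL_SCALE = 5.0   # ADC input range in volts (assumed)
ADC_LEVELS = 2 ** 16   # 16-bit converter (assumed)

def digitise(eeg_volts):
    """Amplify a raw EEG sample (in volts) and quantise it to an ADC code."""
    amplified = eeg_volts * GAIN
    code = round((amplified / ADC_FULL_SCALE) * (ADC_LEVELS // 2))
    # Clamp to the converter's signed range.
    return max(-(ADC_LEVELS // 2), min(ADC_LEVELS // 2 - 1, code))

# A 50-microvolt scalp potential becomes a mid-range ADC code.
print(digitise(50e-6))  # 3277
```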

High-efficiency algorithms analyze the signals using machine-learning methods. They are capable of detecting changes in brain activity triggered by the purely mental conception of a particular behaviour. They can, for instance, unequivocally identify the patterns reflecting the idea of moving the left or right hand and extract them from the many millions of neural impulses. These patterns are then converted into control commands for the computer.
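One way such a classifier can work, sketched very roughly here, exploits the fact that imagined movement of one hand suppresses rhythmic EEG power over the opposite motor cortex. The electrode names, the power estimate, and the decision rule below are illustrative assumptions, not the project's actual algorithm, which uses trained machine-learning models.

```python
import math

# Hypothetical sketch: distinguish "left hand" vs "right hand" motor imagery
# by comparing EEG band power over the two motor cortices (electrodes C3
# and C4 in the standard 10-20 layout). Not the Brain2Robot classifier.

def band_power(samples):
    """Mean squared amplitude of a signal segment (a crude power estimate)."""
    return sum(s * s for s in samples) / len(samples)

def classify_motor_imagery(c3_samples, c4_samples):
    """Imagined right-hand movement suppresses power over the left motor
    cortex (C3), and vice versa; compare log power of the two sides."""
    lateralisation = math.log(band_power(c4_samples)) - math.log(band_power(c3_samples))
    return "right_hand" if lateralisation > 0 else "left_hand"

# Toy segments: much lower power over C3 suggests imagined right-hand movement.
c3 = [0.2 * math.sin(0.3 * t) for t in range(256)]
c4 = [1.0 * math.sin(0.3 * t) for t in range(256)]
print(classify_motor_imagery(c3, c4))  # right_hand
```

In practice a trained model replaces the fixed threshold, but the principle is the same: map a spatial pattern of brain activity to a discrete command.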

In the Brain2Robot project, the pattern reflecting the idea of moving the right hand is used to set the robot arm in motion. The signal for the left hand triggers a specific action of the arm, such as grasping or lifting a coffee cup. Team leader Florin Popescu defines the project’s goals as follows: “The project is designed to help severely handicapped people cope with everyday life. The advantage of our technology is that it can directly convert intended movements into control commands for the computer.”
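The control mapping described above combines the two inputs: the eye tracker supplies a direction, and the detected imagery selects the action. The command names and state logic below are hypothetical, intended only to make the division of labour concrete.

```python
# Hypothetical sketch of the Brain2Robot control mapping: gaze sets the
# target direction, right-hand imagery starts a move, left-hand imagery
# triggers a grasp. Command vocabulary and tuple format are assumptions.

def robot_command(imagery, gaze_direction):
    """Map a classified motor-imagery label plus a gaze vector to a command."""
    if imagery == "right_hand":
        return ("move", gaze_direction)   # move the arm toward the gaze target
    if imagery == "left_hand":
        return ("grasp", None)            # e.g. close the gripper on a cup
    return ("idle", None)                 # no recognised pattern: do nothing

print(robot_command("right_hand", (0.4, -0.1, 0.7)))  # ('move', (0.4, -0.1, 0.7))
print(robot_command("left_hand", None))               # ('grasp', None)
```

Keeping gaze and imagery separate like this means the patient never has to "think" a direction, only an intent, which matches the division of labour the article describes.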

In developing the Brain2Robot system, the focus has been on medical applications, in particular the control of prostheses, supportive robots or wheelchairs.

Source: Fraunhofer FIRST