Research and innovation
With the autobiographical book "The Diving Bell and the Butterfly," the French journalist Jean-Dominique Bauby brought the condition of a person with locked-in syndrome to public attention. That volume, composed letter by letter through more than 200,000 eyelid blinks, fixed in the collective imagination the use of the eyes as a means of communication for patients otherwise unable to interact with the outside world.
Since the publication of Bauby's autobiography in 1997, technological development has accelerated, extending to studies in the field of human-robot interaction. Among them is the thesis work "Brain-computer interface for robot control with eye artifacts for assistive applications," published in Scientific Reports, which also involved a SUPSI researcher. Alongside Kaan Karas, Luca Pozzi, Alessandra Pedrocchi, and Francesco Braghin (a master's student, a doctoral student, and two professors at the Politecnico di Milano, respectively), Loris Roveda, SUPSI senior researcher at the Dalle Molle Institute for Artificial Intelligence (IDSIA USI-SUPSI), contributed to the methodological and conceptual parts of the study.
This paper presented a novel brain-computer interface (BCI) that allows an assistive robot to be controlled through electroencephalogram (EEG) signals. By detecting impulses generated by eye movements, the BCI enables a person with disabilities to interact with his or her physical surroundings.
[Video: a practical demonstration of how the new BCI works.]
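To make concrete what detecting "impulses generated by eye movements" involves: blinks and gaze shifts leave large, stereotyped voltage deflections (ocular artifacts) in frontal EEG channels, which even a simple amplitude threshold can pick out. The following is a minimal sketch, not the authors' pipeline; the sampling rate, threshold, and refractory period are hypothetical values:

```python
import numpy as np

FS = 250  # hypothetical EEG sampling rate, Hz

def detect_eye_impulses(eeg_frontal, threshold_uv=75.0, refractory_s=0.4):
    """Return sample indices where a frontal EEG trace crosses an amplitude
    threshold, a crude stand-in for ocular-artifact detection."""
    refractory = int(refractory_s * FS)
    events, last = [], -refractory
    for i, v in enumerate(eeg_frontal):
        if abs(v) >= threshold_uv and i - last >= refractory:
            events.append(i)
            last = i
    return events

# Toy signal: background noise plus two artifact-like spikes.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 10.0, 5 * FS)  # microvolts
signal[300] += 120.0                    # simulated blink
signal[900] -= 110.0                    # simulated lateral gaze shift
print(detect_eye_impulses(signal))      # expected: [300, 900]
```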
Besides the helmet through which the impulses of brain activity were recorded, the interface is novel in its use of two different algorithms capable of identifying and isolating precise, intentional eye-movement signals (rightward gaze, leftward gaze, and blinking) among the other data detected by the electrodes. The first algorithm was responsible for identifying regular, clearly marked activity, while the second was trained to recognize weaker signals. This dual-threshold method sets the approach apart from other algorithms in the literature so far.
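Purely to make the dual-threshold idea concrete (a hypothetical sketch, not the paper's code), one can imagine a strict threshold that accepts strong, regular artifacts directly, and a looser threshold that flags weak candidates for confirmation by a second, trained stage, represented here by a simple template-correlation check:

```python
import numpy as np

def classify_window(window, high_uv=100.0, low_uv=40.0,
                    template=None, min_corr=0.8):
    """Dual-threshold sketch (illustrative values, not the paper's).

    Peaks above `high_uv` are accepted directly as strong artifacts;
    peaks between `low_uv` and `high_uv` are weak candidates, accepted
    only if the window correlates well with a learned template (a
    stand-in for the second, trained algorithm)."""
    peak = np.max(np.abs(window))
    if peak >= high_uv:
        return "strong_artifact"
    if peak >= low_uv and template is not None:
        corr = np.corrcoef(window, template)[0, 1]
        if corr >= min_corr:
            return "weak_artifact"
    return "no_event"

# Example: a weak, noisy blink-like deflection confirmed by the template.
t = np.linspace(0.0, 1.0, 250)
template = 50.0 * np.exp(-((t - 0.5) ** 2) / 0.01)  # blink-shaped bump
window = 0.9 * template + np.random.default_rng(1).normal(0.0, 2.0, 250)
print(classify_window(window, template=template))   # expected: "weak_artifact"
```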
Once developed, the two algorithms were trained offline, acquiring data and improving impulse-recognition performance, and were then integrated into an online interface that enabled real-time communication, through which the researchers had an assistive robot perform tasks (picking up an object from a table).
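Such an offline-train, online-deploy split might look like the sketch below; the stream, detector, and robot interfaces and the event-to-command mapping are all hypothetical, standing in for components the paper describes in detail:

```python
import time

# Hypothetical mapping from detected eye events to robot actions.
COMMANDS = {
    "gaze_right": "move_arm_right",
    "gaze_left": "move_arm_left",
    "blink": "grasp_object",
}

def online_loop(stream, detector, robot, window_s=1.0):
    """Real-time sketch: read EEG windows, detect intentional eye events
    with a detector calibrated offline, and dispatch robot commands."""
    while True:
        window = stream.read(window_s)      # blocking read of ~1 s of EEG
        event = detector(window)            # e.g. "blink", "gaze_right", ...
        if event in COMMANDS:
            robot.execute(COMMANDS[event])  # e.g. pick an object off a table
        time.sleep(0.01)                    # yield briefly between windows
```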
Other tools also draw usable information from the eyes, such as eye tracking, but the electroencephalography helmet is less susceptible to environmental conditions (e.g., low light) and does not require neck control to function properly, whereas an eye tracker must remain on axis with the eyes to give good results.
The work published in Scientific Reports aimed to explore the use of EEG to provide tools that grant greater autonomy to patients who cannot interact with a robot in any other way.
A novel way of using the eyes to crack open, at least in part, the thick armor of the diving bell.