Megane Pro: Myo-electricity, visual and gaze tracking data acquisitions to improve hand prosthetics / Giordaniello, Francesca; Cognolato, Matteo; Graziani, Mara; Gijsberts, Arjan; Gregori, Valentina; Saetta, Gianluca; Mittaz Hager, Anne-Gabrielle; Tiengo, Cesare; Bassetto, Franco; Brugger, Peter; Caputo, Barbara; Müller, Henning; Atzori, Manfredo. - (2017), pp. 1148-1153. (Paper presented at the 2017 International Conference on Rehabilitation Robotics, ICORR 2017, held in London, United Kingdom) [10.1109/ICORR.2017.8009404].
Megane Pro: Myo-electricity, visual and gaze tracking data acquisitions to improve hand prosthetics
Mara Graziani; Arjan Gijsberts; Valentina Gregori; Barbara Caputo
2017
Abstract
Over the past 60 years, scientific research has proposed many techniques to control robotic hand prostheses with surface electromyography (sEMG). Few of them have been implemented in commercial systems, partly due to limited robustness, which may be improved with multimodal data. This paper presents the first acquisition setup, acquisition protocol and dataset combining sEMG, eye tracking and computer vision to study robotic hand control. A data analysis on healthy controls gives a first idea of the capabilities and constraints of the acquisition procedure, which will be applied to amputees in a next step. The different data sources are not fused in the analysis; nevertheless, the results support the use of the proposed multimodal data acquisition approach for prosthesis control. The sEMG movement classification results confirm that several grasps can be classified with sEMG alone; sEMG can detect the grasp type as well as small differences in the grasped object (accuracy: 95%). The simultaneous recording of eye tracking and scene camera data shows that these sensors allow object detection for grasp selection, and that several neurocognitive parameters need to be taken into account for this. In conclusion, this work on intact subjects presents an innovative acquisition setup and protocol. The first results of the data analysis are promising and set the basis for future work on amputees, aiming to improve the robustness of prostheses with multimodal data. © 2017 IEEE.

File | Size | Format | Access
---|---|---|---
Giordaniello_Megane-Pro_2017.pdf | 881.14 kB | Adobe PDF | Archive administrators only (contact the author)

Type: Published version (publisher's layout)
License: All rights reserved
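The record itself contains no code, and the authors' actual features and classifier are not specified here. As a rough, hypothetical illustration of the kind of windowed sEMG grasp classification the abstract refers to, the sketch below extracts root-mean-square (RMS) features from synthetic two-channel signals and applies a simple nearest-centroid rule (channel layout, window sizes, and classifier are all assumptions, not the paper's method):

```python
import numpy as np

def rms_features(emg, win=200, step=100):
    """Root-mean-square features over sliding windows.

    emg: (samples, channels) array of sEMG signals.
    Returns a (windows, channels) feature matrix.
    """
    feats = []
    for start in range(0, emg.shape[0] - win + 1, step):
        w = emg[start:start + win]
        feats.append(np.sqrt(np.mean(w ** 2, axis=0)))
    return np.array(feats)

# Synthetic demo: two "grasps" that differ in per-channel amplitude.
rng = np.random.default_rng(0)
grasp_a = rng.normal(0, [1.0, 0.2], size=(2000, 2))  # channel 0 active
grasp_b = rng.normal(0, [0.2, 1.0], size=(2000, 2))  # channel 1 active

Xa, Xb = rms_features(grasp_a), rms_features(grasp_b)
centroids = np.stack([Xa.mean(axis=0), Xb.mean(axis=0)])  # one per class

def classify(x):
    # Nearest-centroid decision on the RMS feature vector.
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# Held-out windows drawn from the grasp-A distribution.
test_a = rms_features(rng.normal(0, [1.0, 0.2], size=(2000, 2)))
acc = np.mean([classify(x) == 0 for x in test_a])
print(f"accuracy on held-out grasp-A windows: {acc:.2f}")
```

Real pipelines typically use more channels, richer time- and frequency-domain features, and stronger classifiers; this only shows why per-channel activation patterns make grasp types separable.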