Visual Cues to Improve Myoelectric Control of Upper Limb Prostheses

Gregori, Valentina; Gijsberts, Arjan (2018)

Abstract

The instability of myoelectric signals over time complicates their use for controlling poly-articulated prosthetic hands. To address this problem, studies have tried to combine surface electromyography with modalities that are less affected by the amputation and the environment, such as accelerometry and gaze information. In the latter case, the hypothesis is that a subject looks at the object he or she intends to manipulate, and that the visual characteristics of that object help to predict the desired hand posture. The method we present in this paper automatically detects stable gaze fixations and uses the visual characteristics of the fixated objects to improve the performance of a multimodal grasp classifier. Specifically, the algorithm identifies the onset of a prehension and the corresponding gaze fixations online, obtains high-level feature representations of the fixated objects by means of a Convolutional Neural Network, and combines them with traditional surface electromyography in the classification stage. Tests were performed on data acquired from five intact subjects who performed ten types of grasps on various objects during both static and functional tasks. The results show that adding gaze information increases grasp classification accuracy, and that this improvement is consistent across all grasps and concentrated around movement onset and offset.
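The paper itself does not include code, but the pipeline the abstract describes can be illustrated with a short sketch. The Python example below is a minimal, hypothetical illustration: it uses a dispersion-based (I-DT-style) fixation detector and simple mean-absolute-value sEMG features; the thresholds (max_disp, min_dur) and the cnn_features placeholder are illustrative assumptions, not values or names taken from the paper.

```python
import numpy as np

def detect_fixations(gaze_xy, t, max_disp=0.05, min_dur=0.1):
    """Dispersion-based (I-DT-style) detection of stable gaze fixations.

    gaze_xy: (N, 2) array of normalized gaze coordinates
    t:       (N,) array of sample times in seconds
    Returns a list of (start, end) index pairs, one per fixation.
    """
    def dispersion(w):
        # sum of the horizontal and vertical extents of the window
        return float(np.sum(w.max(axis=0) - w.min(axis=0)))

    fixations, i, n = [], 0, len(t)
    while i < n:
        # take an initial window spanning at least min_dur seconds
        j = i
        while j < n and t[j] - t[i] < min_dur:
            j += 1
        if j >= n:
            break
        if dispersion(gaze_xy[i:j + 1]) <= max_disp:
            # extend the fixation while the gaze stays within the threshold
            while j + 1 < n and dispersion(gaze_xy[i:j + 2]) <= max_disp:
                j += 1
            fixations.append((i, j))
            i = j + 1
        else:
            i += 1  # no fixation starting here; slide the window forward
    return fixations

def fused_features(emg_window, object_crop, cnn_features):
    """Concatenate sEMG features with CNN features of the fixated object.

    emg_window:   (samples, channels) sEMG segment around the prehension
    object_crop:  image patch around the fixation point
    cnn_features: callable returning a 1-D feature vector for an image
    """
    emg_feat = np.mean(np.abs(emg_window), axis=0)  # mean absolute value
    visual_feat = cnn_features(object_crop)         # high-level CNN code
    return np.concatenate([emg_feat, visual_feat])
```

In the setup the abstract outlines, the visual features come from a Convolutional Neural Network applied to the fixated object; here cnn_features stands in for any such pretrained extractor, and the fused vector would be fed to the grasp classifier alongside, at run time, the features of the most recent fixation preceding the detected movement onset.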
2018
2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob)
Prosthetics; Artificial intelligence; Data acquisition; Robotics
04 Publication in conference proceedings :: 04b Conference paper in a volume
Visual Cues to Improve Myoelectric Control of Upper Limb Prostheses / Gigli, Andrea; Gregori, Valentina; Cognolato, Matteo; Atzori, Manfredo; Gijsberts, Arjan. - (2018), pp. 783-788. (Paper presented at the 2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), held in Enschede, Netherlands) [10.1109/BIOROB.2018.8487923].
Files attached to this record
Gigli_Visual-Cues_2018.pdf (access: archive administrators only)
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 806.52 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11573/1321415
Citations
  • PubMed Central: not available
  • Scopus: 9
  • Web of Science: not available