A markerless system for gait analysis based on OpenPose library / D'Antonio, E.; Taborri, J.; Palermo, E.; Rossi, S.; Patane, F. - (2020), pp. 1-6. (Paper presented at the 2020 IEEE International Instrumentation and Measurement Technology Conference, I2MTC 2020, held in hrv) [10.1109/I2MTC43012.2020.9128918].

A markerless system for gait analysis based on OpenPose library

Palermo E.;
2020

Abstract

The paper reports the performance of a low-cost markerless system for 3D human motion detection and tracking, consisting of the open-source library OpenPose, two webcams, and a linear triangulation algorithm. OpenPose identifies anatomical landmarks from commercial webcam footage using Convolutional Neural Networks trained on monocular images. When images from at least two different points of view are processed by OpenPose, 3D kinematic and spatiotemporal data of human gait can also be computed and assessed. Despite its potential, the accuracy of such a system in estimating kinematic parameters of human gait is currently unknown. To estimate the accuracy of OpenPose in measuring 3D lower-limb joint angles during gait, two synchronized videos of a healthy subject were acquired with two webcams during a walking session on a treadmill at comfortable speed. Two-dimensional joint-center coordinates were estimated by OpenPose and reconstructed in 3D by a linear triangulation algorithm. The resulting angular kinematics was then compared with the outputs of inertial sensors. Results showed that the system was generally able to track lower-limb motion, producing angular traces representative of normal gait and similar to those computed by IMUs. However, the OpenPose approach showed inaccuracy, mostly in the computation of maxima and minima of joint angles, with errors up to 9.9°.
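The linear triangulation step the abstract describes — recovering a 3D joint position from the 2D coordinates detected in two calibrated views — can be sketched with the standard Direct Linear Transform (DLT). This is a minimal illustration, not the authors' implementation; the projection matrices and point values below are hypothetical.

```python
import numpy as np

def triangulate_dlt(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (intrinsics @ [R|t]).
    pt1, pt2: (x, y) pixel coordinates of the same joint in each view.
    """
    x1, y1 = pt1
    x2, y2 = pt2
    # Each view contributes two linear constraints on the homogeneous 3D point X.
    A = np.array([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Hypothetical stereo pair observing the point (1, 2, 10):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # shifted 1 unit along x
X_true = np.array([1.0, 2.0, 10.0])

def project(P, X):
    """Project a 3D point with camera matrix P and dehomogenize."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_est = triangulate_dlt(P1, P2, project(P1, X_true), project(P2, X_true))
```

In practice each OpenPose joint detection in the two synchronized frames would be fed through such a routine, after calibrating the webcams to obtain the projection matrices.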
2020
2020 IEEE International Instrumentation and Measurement Technology Conference, I2MTC 2020
Gait analysis; Inertial sensors; Markerless; Motion capture; Neural network
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this item
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1452717
Warning! The displayed data have not been validated by the university.

Citations
  • Scopus: 49