
Discovery and recognition of motion primitives in human activities / Sanzari, Marta; Ntouskos, Valsamis; Pirri, Fiora. - In: PLOS ONE. - ISSN 1932-6203. - 14:4(2019). [10.1371/journal.pone.0214499]

Discovery and recognition of motion primitives in human activities

Sanzari, Marta (co-first author); Ntouskos, Valsamis (co-first author); Pirri, Fiora (co-first author)
2019

Abstract

We present a novel framework for the automatic discovery and recognition of motion primitives in videos of human activities. Given the 3D pose of a human in a video, human motion primitives are discovered by optimizing the ‘motion flux’, a quantity which captures the motion variation of a group of skeletal joints. A normalization of the primitives is proposed to make them invariant with respect to the subject's anatomical variations and the data sampling rate. The discovered primitives are unknown and unlabeled, and are grouped into classes in an unsupervised manner via a hierarchical non-parametric Bayes mixture model. Once the classes are determined and labeled, they are further analyzed to establish models for recognizing the discovered primitives. Each primitive model is defined by a set of learned parameters. Given new video data and the estimated pose of the subject appearing in the video, the motion is segmented into primitives, which are recognized with a probability computed according to the parameters of the learned models. Using our framework we build a publicly available dataset of human motion primitives, using sequences taken from well-known motion capture datasets. We expect that our framework, by providing an objective way of discovering and categorizing human motion, will be a useful tool in numerous research fields, including video analysis, human-inspired motion generation, learning by demonstration, intuitive human-robot interaction, and human behavior analysis.
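To make the pipeline the abstract describes concrete (flux-based segmentation, primitive normalization, non-parametric Bayes clustering), here is a minimal sketch in Python. Everything in it is an assumption made for illustration: motion_flux below is a simplified stand-in (summed joint speeds) rather than the paper's actual motion flux, the normalization is a crude resampling-and-rescaling, and scikit-learn's Dirichlet-process BayesianGaussianMixture stands in for the paper's hierarchical non-parametric Bayes model. All function names are hypothetical, not the authors' code.

import numpy as np
from scipy.signal import argrelextrema
from sklearn.mixture import BayesianGaussianMixture


def motion_flux(poses, dt=1.0):
    """Simplified per-frame motion measure for a group of joints.

    poses: (T, J, 3) array of 3D joint positions over T frames.
    Returns a (T-1,) array of summed joint speeds (an illustrative
    stand-in for the paper's motion flux).
    """
    velocities = np.diff(poses, axis=0) / dt        # (T-1, J, 3)
    speeds = np.linalg.norm(velocities, axis=2)     # (T-1, J)
    return speeds.sum(axis=1)                       # (T-1,)


def segment_primitives(flux, order=5):
    """Cut the sequence at local minima of the flux, so each segment
    spans one burst of motion (a candidate primitive)."""
    minima = argrelextrema(flux, np.less_equal, order=order)[0]
    cuts = [0] + sorted(set(minima.tolist())) + [len(flux)]
    return [(a, b) for a, b in zip(cuts[:-1], cuts[1:]) if b - a > 1]


def normalize_primitive(poses, segment, n_samples=20):
    """Resample a segment to a fixed length (sampling-rate invariance)
    and rescale by skeleton size (a crude anatomical invariance)."""
    a, b = segment
    seg = poses[a:b]                                # (L, J, 3)
    t_old = np.linspace(0, 1, len(seg))
    t_new = np.linspace(0, 1, n_samples)
    flat = seg.reshape(len(seg), -1)                # (L, J*3)
    resampled = np.stack(
        [np.interp(t_new, t_old, c) for c in flat.T], axis=1)
    # Scale by mean joint distance from the root (joint 0 assumed).
    scale = np.linalg.norm(seg - seg[:, :1], axis=2).mean() + 1e-8
    return (resampled / scale).ravel()              # fixed-size feature


def discover_classes(features, max_components=30, seed=0):
    """Group primitives with a Dirichlet-process mixture; the number
    of effective classes is inferred rather than fixed in advance."""
    dpgmm = BayesianGaussianMixture(
        n_components=max_components,
        weight_concentration_prior_type="dirichlet_process",
        random_state=seed,
    )
    return dpgmm.fit_predict(np.asarray(features))


# Usage on hypothetical data:
# poses = np.random.rand(500, 17, 3)     # 500 frames, 17 joints
# flux = motion_flux(poses, dt=1 / 50)
# segs = segment_primitives(flux)
# feats = [normalize_primitive(poses, s) for s in segs]
# labels = discover_classes(feats)

Segmenting at flux minima reflects the intuition that a primitive spans one burst of motion between two near-rest states; fixed-length resampling removes the dependence on the capture frame rate, and the Dirichlet-process mixture lets the number of primitive classes emerge from the data, as in the unsupervised class discovery described above.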
human motion; activity recognition; motion generation; behavior analysis; datasets; robotics; non-parametric Bayes
01 Journal publication::01a Journal article
Files attached to this item
File: Sanzari_Discovery-and-recognition_2019.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 6.29 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1260530
Citations
  • PubMed Central: 3
  • Scopus: 11
  • Web of Science: 10