ADHERENT: Learning Human-like Trajectory Generators for Whole-body Control of Humanoid Robots

Viceconte, Paolo Maria (first author); Camoriano, Raffaello; Romualdi, Giulio; Ferigo, Diego; Dafarra, Stefano; Traversaro, Silvio; Oriolo, Giuseppe; Rosasco, Lorenzo; Pucci, Daniele

2022

Abstract

Human-like trajectory generation and footstep planning represent challenging problems in humanoid robotics. Recently, research in computer graphics has investigated machine-learning methods for character animation that train human-like models directly on motion capture data. Such methods have proved effective in virtual environments, mainly focusing on trajectory visualization. This paper presents ADHERENT, a system architecture integrating machine-learning methods used in computer graphics with whole-body control methods employed in robotics to generate and stabilize human-like trajectories for humanoid robots. Leveraging human motion capture locomotion data, ADHERENT yields a general footstep planner, including forward, sideways, and backward walking trajectories that blend smoothly from one to another. Furthermore, at the joint configuration level, ADHERENT computes data-driven whole-body postural reference trajectories coherent with the generated footsteps, thus increasing the human likeness of the resulting robot motion. Extensive validations of the proposed architecture are presented with both simulations and real experiments on the iCub humanoid robot, demonstrating that ADHERENT is robust to varying step sizes and walking speeds.
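As a reading aid, the following minimal Python sketch mirrors the two-layer split the abstract describes: a learned, autoregressive generator produces footstep and whole-body postural references, and a whole-body controller stabilizes them on the robot. Every name and interface below (TrajectoryGenerator, the velocity-command input, the integrator standing in for the trained network) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np


class TrajectoryGenerator:
    """Illustrative stand-in for a learned, autoregressive trajectory
    generator trained on motion capture data. In the real system a
    trained network would predict the next state; here a simple
    integrator keeps the sketch self-contained and runnable."""

    def __init__(self, n_joints: int = 23, dt: float = 0.01):
        self.n_joints = n_joints
        self.dt = dt
        self.base_pose = np.zeros(3)  # planar base pose: x, y, yaw

    def step(self, velocity_cmd):
        """One autoregressive step: map a velocity command (vx, vy, wz)
        to footstep and whole-body postural references."""
        vx, vy, wz = velocity_cmd
        yaw = self.base_pose[2]
        # Placeholder dynamics: integrate the command in the world frame.
        self.base_pose += self.dt * np.array(
            [vx * np.cos(yaw) - vy * np.sin(yaw),
             vx * np.sin(yaw) + vy * np.cos(yaw),
             wz])
        return {
            "footsteps": self.base_pose.copy(),    # footstep references
            "posturals": np.zeros(self.n_joints),  # joint references
        }


def control_loop(generator, whole_body_controller, n_steps=100):
    """Generation layer feeding a whole-body control layer, mirroring
    the two-layer split described in the abstract."""
    for _ in range(n_steps):
        refs = generator.step(np.array([0.2, 0.0, 0.0]))  # walk forward
        whole_body_controller(refs["footsteps"], refs["posturals"])


# Minimal usage: stand in a print statement for the actual controller.
control_loop(TrajectoryGenerator(),
             lambda footsteps, posturals: print(footsteps),
             n_steps=3)
```

One natural reading of the smooth blending between forward, sideways, and backward gaits mentioned in the abstract is that a command like velocity_cmd can vary continuously while the generator keeps emitting consistent footstep and postural references.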
Keywords: Humanoid Robot Systems; Machine Learning for Robot Control; Whole-Body Motion Planning and Control
Publication type: Journal article
ADHERENT: Learning Human-like Trajectory Generators for Whole-body Control of Humanoid Robots / Viceconte, Paolo Maria; Camoriano, Raffaello; Romualdi, Giulio; Ferigo, Diego; Dafarra, Stefano; Traversaro, Silvio; Oriolo, Giuseppe; Rosasco, Lorenzo; Pucci, Daniele. - In: IEEE ROBOTICS AND AUTOMATION LETTERS. - ISSN 2377-3766. - 7:2(2022), pp. 2779-2786. [10.1109/LRA.2022.3141658]
Files attached to this record

File: Viceconte_ADHERENT_2022.pdf (Adobe PDF, 2.41 MB)
Access: open access
Note: https://ieeexplore.ieee.org/document/9676410
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1604129
Citations
  • PubMed Central: not available
  • Scopus: 7
  • Web of Science (ISI): 3