Autonomous car driving by a humanoid robot

Paolillo, Antonio; Vendittelli, Marilena
2017

Abstract

Enabling a humanoid robot to drive a car requires the development of a set of basic primitive actions. These include walking to the vehicle, manually operating its controls (e.g., ignition, gas pedal, and steering wheel), and moving with the whole body to ingress/egress the car. We present a sensor-based reactive framework for realizing the central part of the complete task: driving the car along unknown roads. The proposed framework provides three driving strategies by which a human supervisor can teleoperate the car or give the robot full or partial control. A visual servoing scheme uses features of the road image to provide the reference angle for the steering wheel, so as to drive the car at the center of the road. Simultaneously, a Kalman filter merges optical flow and accelerometer measurements to estimate the car's linear velocity and correspondingly compute the gas pedal command for driving at a desired speed. The steering wheel and gas pedal references are sent to the robot controller to achieve the driving task with the humanoid. We present results from driving experiments with a real car and the humanoid robot HRP-2Kai. Part of the framework was used to perform the driving task at the DARPA Robotics Challenge.
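The abstract outlines two algorithmic components: a visual-servoing steering law driven by road-image features and a Kalman filter that fuses optical flow and accelerometer data to estimate the car's linear velocity and set the gas pedal command. The sketch below is a minimal, hypothetical Python illustration of such a pipeline, not the authors' implementation: the feature definitions, the constant-acceleration filter model, all gains, noise covariances, and the pedal mapping are assumptions.

```python
"""Illustrative sketch (not the paper's code) of the driving pipeline:
steering from road-image features, velocity estimation by sensor fusion,
and a gas-pedal law tracking a desired speed."""
import numpy as np


def steering_reference(road_offset, road_angle, k_offset=1.5, k_angle=0.8):
    """Visual-servoing-style steering law (assumed form).

    road_offset: lateral offset of the detected road centerline in the image
        (normalized, 0 = car centered on the road).
    road_angle: orientation error of the road centerline (radians).
    Returns a steering-wheel reference angle (radians) that drives both
    feature errors to zero, keeping the car at the center of the road.
    """
    return -(k_offset * road_offset + k_angle * road_angle)


class VelocityKalmanFilter:
    """1-D Kalman filter with state [velocity, acceleration].

    Prediction uses a constant-acceleration model; the optical-flow speed
    estimate and the accelerometer reading are fused as measurements.
    Noise values are placeholders, not those used in the paper.
    """

    def __init__(self, dt, q=0.1, r_flow=0.5, r_acc=0.2):
        self.x = np.zeros(2)                        # [velocity, acceleration]
        self.P = np.eye(2)                          # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-acceleration model
        self.Q = q * np.eye(2)                      # process noise
        self.H = np.eye(2)                          # measure both v and a
        self.R = np.diag([r_flow, r_acc])           # measurement noise

    def step(self, v_optical_flow, a_accelerometer):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the fused measurement vector
        z = np.array([v_optical_flow, a_accelerometer])
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]  # estimated linear velocity


def gas_pedal_command(v_estimated, v_desired, k_v=0.4):
    """Proportional pedal law (assumed): press harder when below the desired
    speed, release when above; output clipped to a normalized [0, 1] stroke."""
    return float(np.clip(k_v * (v_desired - v_estimated), 0.0, 1.0))


# Minimal control-loop example with synthetic sensor readings.
if __name__ == "__main__":
    kf = VelocityKalmanFilter(dt=0.05)
    v_est = kf.step(v_optical_flow=2.1, a_accelerometer=0.3)
    steer = steering_reference(road_offset=0.12, road_angle=0.05)
    pedal = gas_pedal_command(v_est, v_desired=3.0)
    print(f"steering reference: {steer:.3f} rad, gas pedal: {pedal:.2f}")
```

In the paper's framework these two references (steering-wheel angle and gas-pedal stroke) are then passed to the humanoid's whole-body controller, which actuates the car's controls with its arms and leg.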
autonomous driving; humanoid robots; visual control
01 Journal publication::01a Journal article
Autonomous car driving by a humanoid robot / Paolillo, Antonio; Gergondet, Pierre; Cherubini, Andrea; Vendittelli, Marilena; Kheddar, Abderrahmane. - In: JOURNAL OF FIELD ROBOTICS. - ISSN 1556-4959. - ELECTRONIC. - 35:2(2017), pp. 169-186. [10.1002/rob.21731]
Files attached to this record

File: Paolillo_Autonomous_2018.pdf (access restricted to archive administrators)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 2.53 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/979535
Citations
  • PMC: not available
  • Scopus: 19
  • Web of Science (ISI): 14