
Semi-autonomous Guidance, Navigation and Control system for planetary rovers / Andolfo, S.; Del Vecchio, E.; Gargiulo, A. M.; Petricca, F.; Genova, A. - 2022-:(2022). (Paper presented at the 73rd International Astronautical Congress, IAC 2022, held in Paris, France).

Semi-autonomous Guidance, Navigation and Control system for planetary rovers

Andolfo S.; Del Vecchio E.; Gargiulo A. M.; Petricca F.; Genova A.
2022

Abstract

Future space robotic missions will involve autonomous assets that explore remote areas of the Solar System safely and with limited assistance from ground operators. Although planetary rovers' activities are still mainly scheduled and planned from Earth, current research focuses on increasing rovers' capability to make decisions autonomously based on their perception of the surroundings. To enhance the efficiency of future robotic assets, we are developing a guidance, navigation and control (GNC) system that enables safe operations on heterogeneous, unprepared terrains. The software accounts for an accurate model of the rover's dynamical equations to simulate traverses across different terrain and slope conditions. Furthermore, by processing the visual input from the left- and right-eye cameras, local 3D maps of the rover's neighborhood are built to support path planning and to enhance localization accuracy through Visual Odometry (VO) techniques. VO methods are currently employed by planetary rovers to provide accurate relative pose updates during short- and mid-range traverses by detecting and tracking image keypoints through successive pairs of stereo images. VO algorithms achieve higher localization accuracy than dead-reckoning methods based on Wheel Odometry (WO) data, which tend to overestimate the travelled distance because of wheel slippage. To assess the localization performance of our GNC system, we employed VO techniques to process stereo images acquired by the NASA Mars 2020 rover, Perseverance. Currently exploring the Martian Jezero delta, Perseverance represents the state of the art in planetary surface exploration, with dedicated hardware for demanding computer vision tasks that enables the rover to drive autonomously for hundreds of meters in a single Martian day.
We present here the methods used to reconstruct Perseverance's position and attitude along different traverses that the vehicle performed during the first year of the mission. We retrieved the rover's pose by processing the high-resolution images captured by the stereo navigation cameras in a 3D-to-3D VO framework that also accounts for an accurate model of the nonlinear effects characterizing the camera optics. The VO-based estimate of the rover's attitude is fully consistent with the orientation provided by the accurate Inertial Measurement Unit (IMU) data. The discrepancies between the reconstructed and the telemetered rover positions across the site map suggest errors in the WO measurements, which are mitigated by our VO estimate.
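The core step of a 3D-to-3D VO pipeline like the one described above is the estimation of the rigid transform (rotation and translation) that aligns the 3D keypoints triangulated from one stereo pair with the matched keypoints from the next pair. The sketch below illustrates that alignment step with the standard Kabsch/Horn closed-form SVD solution; it is a minimal illustration of the general technique, not the authors' implementation, and the function name and point sets are hypothetical.

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Estimate the rotation R and translation t such that Q ~ R @ P + t,
    given matched 3D point sets P and Q of shape (3, N).

    This is the closed-form Kabsch/Horn solution used as the pose-update
    step in 3D-to-3D visual odometry: P holds keypoints triangulated from
    the previous stereo pair, Q the matched keypoints from the current one.
    """
    # Centroids of both point clouds
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)

    # 3x3 cross-covariance of the centered point sets
    H = (P - p_mean) @ (Q - q_mean).T

    # SVD of the cross-covariance; guard against a reflection solution
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

    # Translation follows from the centroids
    t = q_mean - R @ p_mean
    return R, t
```

In a full VO pipeline this least-squares step would typically be wrapped in an outlier-rejection scheme (e.g., RANSAC over the keypoint matches), since a single mismatched keypoint can bias the closed-form solution.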
2022
73rd International Astronautical Congress, IAC 2022
Autonomous Navigation; Path Planning; Planetary Exploration Rovers; Visual Odometry
04 Conference proceedings publication::04b Conference paper in volume


Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1701519
