Visual odometry analysis of the NASA Mars 2020 Perseverance rover's images

Andolfo, S.; Petricca, F.; Genova, A.
2022

Abstract

Visual Odometry is a key technique used by planetary exploration rovers to achieve high localization accuracy during motion. Visual Odometry methods reconstruct the position and attitude of moving assets by processing visual inputs acquired by onboard stereo cameras. Vision-based localization techniques have supported the operations of planetary rovers over the last two decades, including NASA's Mars 2020 mission with the Perseverance rover. Our work focuses on the analysis of the stereo pairs acquired by Perseverance's navigation cameras on sol 65 to retrieve the vehicle's position and attitude along its path. The images are processed with a 3D-to-3D stereo vision-based algorithm that employs the CAHVORE camera model to account for the nonlinear optical effects characterizing the raw images. By comparing the triangulated coordinates of the same set of landmarks observed before and after the entire drive, we show that the trajectory retrieved with our Visual Odometry software improves on the rover's archived position estimates.
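In a 3D-to-3D visual odometry scheme such as the one described in the abstract, each motion step is recovered as the rigid transformation that best aligns the 3D coordinates of landmarks triangulated from the stereo pair taken before the step with the coordinates of the same landmarks triangulated after it. The paper's actual pipeline (feature matching, CAHVORE-based ray modeling, stereo triangulation, outlier handling) is not reproduced here; the snippet below is only a minimal sketch of the core alignment step, using the standard SVD-based least-squares solution for a rigid transform between corresponding 3D point sets. The function name and the synthetic landmark data are illustrative assumptions, not the authors' code.

```python
# Illustrative sketch only: generic SVD-based (Kabsch/Umeyama-style) rigid
# alignment of two sets of corresponding 3D landmarks. NOT the authors' software.
import numpy as np

def estimate_rigid_transform(pts_before, pts_after):
    """Find R, t such that pts_after ~ R @ pts_before + t (least squares).

    pts_before, pts_after: (N, 3) arrays holding the same landmarks
    triangulated from the stereo pairs taken before and after a drive segment.
    """
    c_b = pts_before.mean(axis=0)                    # centroid, "before" set
    c_a = pts_after.mean(axis=0)                     # centroid, "after" set
    H = (pts_before - c_b).T @ (pts_after - c_a)     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_a - R @ c_b
    return R, t

# Synthetic check with made-up landmarks (assumed data, illustration only)
rng = np.random.default_rng(1)
landmarks = rng.uniform(-5.0, 5.0, size=(25, 3))
true_R = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])                # 90-degree yaw
true_t = np.array([1.2, 0.3, 0.05])
after = landmarks @ true_R.T + true_t                # landmarks after the motion
R_est, t_est = estimate_rigid_transform(landmarks, after)
print(np.allclose(R_est, true_R), np.allclose(t_est, true_t))  # True True
```

In a real pipeline, the correspondences would come from image features matched across the stereo pairs and triangulated through the CAHVORE camera model, and the rover's own motion over the segment is the inverse of the transform estimated between the landmark sets.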
9th IEEE International Workshop on Metrology for AeroSpace, MetroAeroSpace 2022
visual odometry; rovers; space robotic systems
04 Publication in conference proceedings::04b Conference paper in volume
Visual odometry analysis of the NASA Mars 2020 Perseverance rover's images / Andolfo, S.; Petricca, F.; Genova, A. - (2022), pp. 287-292. (Paper presented at the 9th IEEE International Workshop on Metrology for AeroSpace, MetroAeroSpace 2022, held in Pisa, Italy) [10.1109/MetroAeroSpace54187.2022.9856188].


Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1659069

Citations
  • PMC: n/a
  • Scopus: 5
  • Web of Science (ISI): 2