Visual SLAM for flying vehicles

GRISETTI, GIORGIO; 2008

Abstract

The ability to learn a map of the environment is important for numerous types of robotic vehicles. In this paper, we address the problem of learning a visual map of the ground using flying vehicles. We assume that the vehicles are equipped with one or two low-cost, downward-looking cameras in combination with an attitude sensor. Our approach constructs a visual map that can later be used for navigation. Key advantages of our approach are that it is comparatively easy to implement, robustly deals with noisy camera images, and operates with either a monocular camera or a stereo camera system. Our technique uses visual features and estimates the correspondences between features using a variant of the progressive sample consensus (PROSAC) algorithm. This allows our approach to extract spatial constraints between camera poses, which can then be used to address the simultaneous localization and mapping (SLAM) problem by applying graph methods. Furthermore, we address the problem of efficiently identifying loop closures. Several experiments with flying vehicles demonstrate that our method is able to construct maps of large outdoor and indoor environments. © 2008 IEEE.
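The paper itself contains no code; the following Python sketch only illustrates the general hypothesize-and-verify idea behind PROSAC-style matching mentioned in the abstract: minimal samples are drawn from a progressively growing pool of the best-ranked putative matches rather than uniformly, as in plain RANSAC. The callbacks estimate_model and count_inliers, the pool-growth schedule, and all parameter names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def prosac_style_estimate(matches, scores, estimate_model, count_inliers,
                          min_sample=3, max_iterations=1000, threshold=0.05):
    """Hypothesize-and-verify with progressive sampling (PROSAC-like sketch).

    Unlike plain RANSAC, minimal samples are drawn from a growing pool of
    the best-ranked putative matches, so good hypotheses tend to be found
    earlier. `estimate_model` and `count_inliers` are caller-supplied
    callbacks (hypothetical placeholders, not a fixed API).
    """
    order = np.argsort(scores)[::-1]            # best-scoring matches first
    ranked = [matches[i] for i in order]
    if len(ranked) < min_sample:                # not enough matches to hypothesize
        return None, 0
    best_model, best_inliers = None, 0
    for it in range(max_iterations):
        # Simplified growth schedule: widen the sampling pool every 10 iterations.
        pool = min(len(ranked), min_sample + it // 10)
        idx = np.random.choice(pool, size=min_sample, replace=False)
        model = estimate_model([ranked[i] for i in idx])
        n_in = count_inliers(model, ranked, threshold)
        if n_in > best_inliers:                 # keep the hypothesis with most support
            best_model, best_inliers = model, n_in
    return best_model, best_inliers
```

In a visual-SLAM pipeline of the kind the abstract describes, the surviving inlier correspondences between two camera images yield a relative pose estimate, which can then be added as an edge (a spatial constraint) to the pose graph that a graph-based SLAM back end optimizes.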
2008
attitude sensor; flying vehicles; simultaneous localization and mapping (SLAM); vision
01 Journal publication::01a Journal article
Visual SLAM for flying vehicles / Steder, B.; Grisetti, G.; Stachniss, C.; Burgard, W. - In: IEEE TRANSACTIONS ON ROBOTICS. - ISSN 1552-3098. - 24:5 (2008), pp. 1088-1093. [10.1109/TRO.2008.2004521]
Files attached to this record

File: Steder_Postprint_Visual-SLAM_2008.pdf
Access: open access
Note: https://ieeexplore.ieee.org/document/4636756
Type: Post-print (version after peer review, accepted for publication)
License: All rights reserved
Size: 444.03 kB
Format: Adobe PDF

File: Steder_Visual-SLAM_2008.pdf
Access: archive administrators only
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 310.35 kB
Format: Adobe PDF (contact the author for access)

File: VE_2008_11573-137100.pdf
Access: archive administrators only
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 313.87 kB
Format: Adobe PDF (contact the author for access)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/137100
Citations
  • Scopus: 64
  • Web of Science (ISI): 41