Robust Intrinsic and Extrinsic Calibration of RGB-D Cameras

Filippo Basso, Emanuele Menegatti, Alberto Pretto
2018

Abstract

Color-depth cameras (RGB-D cameras) have become the primary sensors in most robotics systems, from service robotics to industrial robotics applications. Typical consumer-grade RGB-D cameras ship with a coarse intrinsic and extrinsic calibration that generally does not meet the accuracy requirements of many robotics applications [e.g., highly accurate three-dimensional (3-D) environment reconstruction and mapping, high-precision object recognition and localization]. In this paper, we propose a human-friendly, reliable, and accurate calibration framework that makes it easy to estimate both the intrinsic and extrinsic parameters of a general color-depth sensor pair. Our approach is based on a novel two-component error model that unifies the error sources of RGB-D pairs based on different technologies, such as structured-light 3-D cameras and time-of-flight cameras. Our method offers several important advantages over other state-of-the-art systems: it is general (i.e., well suited to different types of sensors), relies on a simple and stable calibration protocol, provides higher calibration accuracy, and has been implemented within the Robot Operating System (ROS) framework. We report detailed experimental validations and performance comparisons to support our claims.
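To make the role of these parameters concrete, the sketch below shows how a calibrated RGB-D pair is typically used downstream: given the depth camera intrinsics K_d, the color camera intrinsics K_rgb, and the extrinsic depth-to-color transform (R, t), every depth pixel can be back-projected to 3-D and reprojected onto the color image. This is a minimal, generic registration baseline, not the calibration method proposed in the paper; all numeric values (focal lengths, principal points, baseline) are made-up placeholders.

    import numpy as np

    # Hypothetical intrinsics and extrinsics for a depth/color camera pair.
    # All values are placeholders, not calibration results from the paper.
    K_d   = np.array([[570.0, 0.0, 319.5], [0.0, 570.0, 239.5], [0.0, 0.0, 1.0]])
    K_rgb = np.array([[525.0, 0.0, 319.5], [0.0, 525.0, 239.5], [0.0, 0.0, 1.0]])
    R = np.eye(3)                     # depth-to-color rotation (assumed identity)
    t = np.array([0.025, 0.0, 0.0])   # assumed ~2.5 cm baseline, in meters

    def register_depth_to_color(depth, K_d, K_rgb, R, t):
        """Back-project every depth pixel to 3-D, move it into the color
        camera frame via (R, t), and reproject it with the color intrinsics."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        rays = np.linalg.inv(K_d) @ np.stack(
            [u, v, np.ones_like(u)]).reshape(3, -1)        # unit-depth rays
        pts_d = rays * depth.reshape(1, -1)                # 3-D points, depth frame
        pts_c = R @ pts_d + t[:, None]                     # 3-D points, color frame
        proj = K_rgb @ pts_c
        return (proj[:2] / proj[2:]).T.reshape(h, w, 2)    # (u, v) in the RGB image

    depth = np.full((480, 640), 1.5)    # synthetic flat depth map at 1.5 m
    uv = register_depth_to_color(depth, K_d, K_rgb, R, t)
    print(uv[240, 320])                 # where the central depth pixel lands in RGB

Errors in any of these parameters appear directly as misaligned color-depth correspondences in the registered image, which is why a joint intrinsic and extrinsic refinement such as the one proposed in the paper matters in practice.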
Keywords: Calibration; Camera calibration; Camera pairs; Cameras; Depth cameras; Robot vision systems; Sensor systems; Computer Vision and Pattern Recognition
01 Journal publication::01a Journal article
Robust Intrinsic and Extrinsic Calibration of RGB-D Cameras / Basso, Filippo; Menegatti, Emanuele; Pretto, Alberto. - In: IEEE TRANSACTIONS ON ROBOTICS. - ISSN 1552-3098. - 34:5(2018), pp. 1315-1332. [10.1109/TRO.2018.2853742]
Files attached to this item

Basso_Robust-Intrinsic_2018.pdf
Access: restricted (archive administrators only)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 5.61 MB
Format: Adobe PDF

Basso_Postoprint_Robust-Intrinsic_2018.pdf
Access: open access
Note: https://ieeexplore.ieee.org/document/8423784
Type: Post-print (version following peer review and accepted for publication)
License: All rights reserved
Size: 4.7 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1172982
Citations
  • PMC: not available
  • Scopus: 60
  • Web of Science: 52