Adaptive and robust algorithms and tests for visual-based navigation of a space robotic manipulator / Sabatini, Marco; Monti, Riccardo; Gasbarri, Paolo; Palmerini, Giovanni Battista. - PRINT. - 6:(2011), pp. 5265-5280. (Paper presented at the 62nd International Astronautical Congress 2011, IAC 2011, held in Cape Town, South Africa, 3-7 October 2011).

Adaptive and robust algorithms and tests for visual-based navigation of a space robotic manipulator

Sabatini, Marco; Monti, Riccardo; Gasbarri, Paolo; Palmerini, Giovanni Battista
2011

Abstract

Optical navigation for guidance and control of robotic systems is a well-established technique from both the theoretical and the practical point of view. Depending on the positioning of the camera, the problem can be approached in two ways. The first, "eye-to-hand", uses a fixed camera, external to the robot, which makes it possible to determine the position of the target object to be reached. The second, "eye-in-hand", mounts the camera on the end-effector of the manipulator; in this case the target object position is determined not in an absolute reference frame, but with respect to the image plane of the moving camera. In this paper, the algorithms and the test campaign applied to the planar multibody manipulator developed in the Guidance and Navigation Lab at the University of Rome La Sapienza are reported for the eye-in-hand case. In fact, since the space environment is the target application of this research activity, it is quite difficult to imagine a fixed, non-floating camera in an orbital grasping maneuver. The classic Image Based Visual Servoing approach evaluates the control actions directly from the error between the current image of a feature and the image of the same feature in the final desired configuration. Both simulation and experimental tests show that this classic approach can fail when navigation errors and actuation delays are included. Moreover, changing light conditions or the presence of unexpected obstacles can cause the camera to fail in acquiring the target. To overcome these two problems, a Modified Image Based Visual Servoing algorithm and an Extended Kalman Filter for feature position estimation are developed and applied. In particular, the filter performs well when the target's depth information is supplied; a simple procedure for estimating the initial target depth is therefore developed and tested. As a result of all the novel approaches proposed, the experimental test campaign shows a remarkable increase in the robustness of the guidance, navigation and control system. Copyright ©2010 by the International Astronautical Federation. All rights reserved.
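For context, the classic Image Based Visual Servoing law referred to in the abstract maps the image-plane feature error to a commanded camera velocity through the pseudo-inverse of the interaction matrix, whose translational entries depend on the feature depth Z; this is why the depth estimate (supplied in the paper by the Extended Kalman Filter and the initial-depth procedure) matters. The Python sketch below illustrates only the standard textbook law for point features, with assumed function names, gain and coordinate values; it is not the modified algorithm developed in the paper.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    # Standard point-feature interaction (image Jacobian) matrix:
    # relates the 6-DOF camera velocity screw [vx, vy, vz, wx, wy, wz]
    # to the image-plane velocity of a point at normalized
    # coordinates (x, y) and depth Z.
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,      -(1.0 + x**2),  y],
        [0.0,      -1.0 / Z, y / Z, 1.0 + y**2, -x * y,        -x],
    ])

def ibvs_velocity(s, s_des, Z, gain=0.5):
    # Classic IBVS law: v = -gain * pinv(L) @ (s - s_des), where s
    # stacks the (x, y) coordinates of every tracked feature.
    pts = s.reshape(-1, 2)
    L = np.vstack([interaction_matrix(x, y, Z) for x, y in pts])
    return -gain * np.linalg.pinv(L) @ (s - s_des)

# Hypothetical example: four point features, common depth guess Z = 1 m.
s_cur = np.array([0.12, 0.05, -0.10, 0.07, 0.11, -0.08, -0.09, -0.06])
s_ref = np.array([0.10, 0.10, -0.10, 0.10, 0.10, -0.10, -0.10, -0.10])
print(ibvs_velocity(s_cur, s_ref, Z=1.0))  # [vx, vy, vz, wx, wy, wz]
```

Note how every translational column of the interaction matrix scales with 1/Z or x/Z: a poor depth estimate distorts the commanded velocity, consistent with the abstract's observation that the filter needs the target's depth information to perform well.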
2011
62nd International Astronautical Congress 2011, IAC 2011
guidance navigation and control; optical navigation; robotic arms
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this record
File: IAC-11,C1,8,4,x11084.pdf (access restricted to archive managers)
Type: Post-print (version following peer review, accepted for publication)
License: All rights reserved
Size: 622.65 kB
Format: Adobe PDF


Use this identifier to cite or link to this record: https://hdl.handle.net/11573/385227

Citations
  • PMC: ND
  • Scopus: 6
  • Web of Science (ISI): 31