Multi-sensor coordination in human-robot interaction / Khatib, Maram. - (2020 Feb 21).

Multi-sensor coordination in human-robot interaction

KHATIB, MARAM
21/02/2020

Abstract

In the framework of human-robot interaction, a robot and a human operator may need to move in close coordination within the same workspace. This thesis considers contactless human-robot collaboration with coordinated motion tasks and presents a control system based on multiple sensors for safe and efficient collaboration. Contactless coordinated motion can be achieved using vision, by mounting a camera either on the robot end-effector or on the human. We consider here a visual coordination task in which the robot end-effector must maintain a prescribed position with respect to a moving RGB-D camera while pointing at it. For the 3D localization of the moving camera, we compare three different techniques and introduce some improvements to the solution best suited to our application. Alternatively, an Oculus Rift HMD can be used to track an operator moving in the workspace; in this case, a stereo camera provides a mixed-reality interface that lets the user choose among different collaboration modes. The robot must also avoid collisions with the operator and with nearby static or dynamic obstacles, based on distance computations performed in the depth space of a fixed Kinect sensor. To exploit the advantage of robot redundancy effectively and efficiently, different soft constraints are proposed for both the coordinated motion and collision avoidance tasks. Two relaxed versions of the pointing part of the task are introduced to achieve the desired task without exhausting the robot's capabilities, and a relaxed formulation of the collision avoidance task that does not degrade avoidance performance is used. Several control algorithms of different complexity are developed to suitably combine and organize the simultaneous control tasks according to their priorities. The proposed control system, with its different approaches, is validated through V-REP and MATLAB simulations and through experiments on the 7-dof KUKA LWR manipulator.
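The abstract refers to combining simultaneous control tasks according to their priorities on a redundant manipulator. As an illustrative sketch only, and not the controller developed in the thesis, the snippet below shows a standard two-level task-priority resolution in Python/NumPy, where a hypothetical collision-avoidance velocity is executed in the null space of the coordinated-motion task; all function and variable names, and the damping value, are assumptions.

```python
import numpy as np

def task_priority_velocity(J1, x1_dot, J2, x2_dot, damping=1e-3):
    """Two-level task-priority redundancy resolution (illustrative sketch).

    J1, x1_dot : Jacobian and desired velocity of the primary task
                 (e.g., coordinated motion with the moving camera).
    J2, x2_dot : Jacobian and desired velocity of the secondary task
                 (e.g., collision avoidance in depth space).
    Returns the joint velocity command q_dot.
    """
    # Damped least-squares pseudoinverse to cope with near-singular configurations
    def dls_pinv(J):
        return J.T @ np.linalg.inv(J @ J.T + damping * np.eye(J.shape[0]))

    J1_pinv = dls_pinv(J1)
    # Primary task solution
    q_dot = J1_pinv @ x1_dot
    # Null-space projector of the primary task
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1
    # Secondary task resolved in the null space of the primary one
    q_dot += dls_pinv(J2 @ N1) @ (x2_dot - J2 @ q_dot)
    return q_dot
```

The damped least-squares inverse is used here only to keep the sketch numerically robust near singularities; the thesis itself develops several algorithms of different complexity for combining the tasks.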
Files attached to this record:
Tesi_dottorato_Khatib.pdf
  Access: open access
  Type: Doctoral thesis
  License: All rights reserved
  Size: 25.34 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1366005