
YOLO v4 Based Algorithm for Resident Space Object Detection and Tracking / Mastrofini, M.; Goracci, G.; Agostinelli, I.; Curti, F. - 1088:(2023), pp. 19-33. (Paper presented at the 2nd International Conference on Applied Intelligence and Informatics, AII 2022, held in Reggio Calabria, Italy) [10.1007/978-3-031-25755-1_2].

YOLO v4 Based Algorithm for Resident Space Object Detection and Tracking

Mastrofini M. (first author) – Conceptualization;
Goracci G. (second author) – Software;
Agostinelli I. (penultimate author) – Writing – Original Draft Preparation;
Curti F. (last author) – Writing – Review & Editing
2023

Abstract

The detection and tracking of Resident Space Objects (RSOs) are key challenges in the framework of Space Situational Awareness (SSA). The growing number of active and inactive platforms and the incoming era of mega-constellations are increasing traffic in the near-Earth segment. Research efforts have increasingly focused on this problem and, combined with the popularity of Artificial Intelligence (AI) applications, have led to interesting solutions. The potential of an AI-based approach to image processing, object detection, and tracking for space optical sensor applications has already been demonstrated. In this work, the architecture of a Convolutional Neural Network (CNN) based algorithm has been developed and tested. The image processing and object detection tasks are delegated to Neural Network (NN) modules (U-Net and YOLO v4, respectively), while the tracking of objects inside the sensor's Field Of View (FOV) is formulated as an optimization problem. A performance comparison in terms of detection capabilities has been carried out against a previous version of the algorithm based on YOLO v3. Reported results, based on real and simulated night-sky images, show a notable performance improvement from v3 to v4.
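The abstract states that tracking inside the sensor's FOV is formulated as an optimization problem. The paper's exact formulation is not reproduced here; a minimal sketch of one common choice is frame-to-frame data association, where detected RSO centroids in consecutive frames are matched by minimizing total displacement. The function name `associate` and the brute-force search below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch: frame-to-frame association of RSO detections
# posed as an assignment optimization (assumed approach, not the
# paper's exact method).
from itertools import permutations
import math

def associate(prev, curr):
    """Match each previous-frame centroid to a current-frame centroid
    by minimizing the total Euclidean displacement.

    Brute-force search over all assignments; assumes
    len(prev) <= len(curr). Returns (matches, total_cost), where
    matches is a list of (prev_index, curr_index) pairs.
    """
    best, best_cost = None, math.inf
    for perm in permutations(range(len(curr)), len(prev)):
        cost = sum(math.dist(prev[i], curr[j]) for i, j in enumerate(perm))
        if cost < best_cost:
            best_cost, best = cost, list(enumerate(perm))
    return best, best_cost

# Two objects drifting slightly between frames (pixel coordinates).
prev = [(10.0, 10.0), (40.0, 40.0)]
curr = [(41.0, 39.0), (11.0, 12.0)]
matches, cost = associate(prev, curr)
# matches pairs each track with its nearest new detection:
# [(0, 1), (1, 0)]
```

In practice the factorial-time search would be replaced by a polynomial-time solver (e.g. the Hungarian algorithm), and the cost could also incorporate predicted motion; the sketch only illustrates the "tracking as optimization" framing.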
2nd International Conference on Applied Intelligence and Informatics , AII 2022
space systems; resident space objects; artificial intelligence; star sensors
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this item:
Mastrofini_Yolo_2023.pdf (access restricted to archive administrators; contact the author)
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 2.93 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1691493
Citations
  • Scopus: 1