Fusing Self-Organized Neural Network and Keypoint Clustering for Localized Real-Time Background Subtraction / Avola, D.; Bernardi, M.; Cinque, L.; Massaroni, C.; Foresti, G. L.. - In: INTERNATIONAL JOURNAL OF NEURAL SYSTEMS. - ISSN 0129-0657. - 30:4(2020), p. 2050016. [10.1142/S0129065720500161]
Fusing Self-Organized Neural Network and Keypoint Clustering for Localized Real-Time Background Subtraction
Avola D.; Cinque L.; Massaroni C.; Foresti G. L.
2020
Abstract
Moving object detection in video streams plays a key role in many computer vision applications. In particular, separating foreground items from the background is a main prerequisite for more complex tasks, such as object classification, vehicle tracking, and person re-identification. Despite the progress made in recent years, a major challenge of moving object detection is still the management of dynamic aspects, including bootstrapping and illumination changes. In addition, the recent widespread adoption of Pan-Tilt-Zoom (PTZ) cameras has made the management of these aspects even more demanding, in terms of performance, due to their mixed movements (i.e., pan, tilt, and zoom). In this paper, a combined keypoint clustering and neural background subtraction method, based on a Self-Organized Neural Network (SONN), is proposed for real-time moving object detection in video sequences acquired by PTZ cameras. Initially, the method performs a spatio-temporal tracking of the sets of moving keypoints to recognize the foreground areas and to establish the background. Then, it applies a neural background subtraction, localized in these areas, to accomplish a foreground detection able to manage bootstrapping and gradual illumination changes. Experimental results on three well-known public datasets, and comparisons with key works from the current literature, show the efficiency of the proposed method in terms of background modeling and subtraction.
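
The abstract describes a two-stage pipeline: (i) spatio-temporal tracking and clustering of moving keypoints to delimit candidate foreground areas, and (ii) neural background subtraction restricted to those areas. The sketch below is only a minimal illustration of that flow under an assumed OpenCV/scikit-learn toolchain; it does not reproduce the paper's SONN model. In particular, cv2.createBackgroundSubtractorMOG2 is a hypothetical stand-in for the neural background model, the input path ptz_sequence.mp4 is illustrative, and all thresholds (residual-motion cutoff, DBSCAN parameters) are assumptions rather than values taken from the paper.

import cv2
import numpy as np
from sklearn.cluster import DBSCAN

cap = cv2.VideoCapture("ptz_sequence.mp4")            # hypothetical input sequence
bg_sub = cv2.createBackgroundSubtractorMOG2(detectShadows=False)  # stand-in for the SONN model

ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Stage 1: spatio-temporal tracking of keypoints via sparse optical flow.
    moving = np.empty((0, 2), dtype=np.float32)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500, qualityLevel=0.01, minDistance=7)
    if pts is not None:
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good_old = pts[status.ravel() == 1].reshape(-1, 2)
        good_new = nxt[status.ravel() == 1].reshape(-1, 2)
        if len(good_old) >= 4:
            # Compensate the global pan/tilt/zoom motion with a RANSAC homography;
            # keypoints with a large residual displacement are treated as moving.
            H, _ = cv2.findHomography(good_old, good_new, cv2.RANSAC, 3.0)
            if H is not None:
                warped = cv2.perspectiveTransform(good_old.reshape(-1, 1, 2), H).reshape(-1, 2)
                residual = np.linalg.norm(good_new - warped, axis=1)
                moving = good_new[residual > 2.0]      # threshold is an assumption

    # Stand-in background subtraction on the whole frame; its output is then
    # masked to the clustered foreground areas (the paper instead localizes the
    # neural subtraction itself inside those areas).
    full_fg = bg_sub.apply(frame)
    fg_mask = np.zeros_like(full_fg)

    # Cluster the moving keypoints to delimit candidate foreground areas.
    if len(moving) > 0:
        labels = DBSCAN(eps=30, min_samples=5).fit_predict(moving)
        for lbl in set(labels) - {-1}:
            x, y, w, h = cv2.boundingRect(moving[labels == lbl].astype(np.int32))
            fg_mask[y:y + h, x:x + w] = full_fg[y:y + h, x:x + w]

    # Stage 2 output: localized foreground mask for the current frame.
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) == 27:                           # Esc to quit
        break
    prev_gray = gray

cap.release()
cv2.destroyAllWindows()

For simplicity the stand-in subtractor runs on the full frame and its output is cropped to the clustered regions, whereas the paper restricts the neural background model itself to those areas to sustain real-time performance on PTZ streams.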