
NEUF: Learning point cloud quality by Non-Euclidean Fast Filtering / Salvo, Eleonora Di; Beghdadi, Azeddine; Cattai, Tiziana; Lumare, Chiara; Scarano, Gaetano; Colonnese, Stefania. - In: IEEE ACCESS. - ISSN 2169-3536. - 11:(2025), pp. 1-14. [10.1109/access.2025.3567190]

NEUF: Learning point cloud quality by Non-Euclidean Fast Filtering

Salvo, Eleonora Di; Cattai, Tiziana; Scarano, Gaetano; Colonnese, Stefania
2025

Abstract

This paper addresses the problem of no-reference visual quality assessment of point clouds, which is useful for extended reality communication services such as remote surgery and education. Accurate, computationally efficient metrics for point cloud visual quality are needed, but the literature lacks a unified view of the signal attributes associated with the psycho-visual features of point clouds. We propose a method that estimates point cloud visual quality by jointly assessing geometry and color noise in areas critical to visual perception. Our approach introduces a novel descriptor that identifies visual degradation through variations in local color and normal components. This descriptor, along with its saliency-weighted variant, leverages Non-Euclidean Laplacian Filtering (NEUF). The NEUF algorithm extracts descriptors using non-Euclidean filtering techniques and employs regression learning to predict end-users' perceived subjective quality by optimally selecting the most relevant features. Simulations show that the NEUF approach, which combines non-Euclidean filtering with data-driven learning, is effective across point cloud categories, including human, non-human, natural, and synthetic content. The NEUF-based quality estimator outperforms existing methods, offering a no-reference metric that can support the design of cutting-edge extended reality services.
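To illustrate the kind of processing the abstract describes, the following Python sketch builds a k-nearest-neighbor graph over the point geometry, applies a graph-Laplacian (non-Euclidean) high-pass filter to the color attributes, and condenses the filtered signal into a small descriptor that a regressor can map to subjective scores. All design choices here (the k-NN graph, Gaussian edge weights, the summary statistics, and ridge regression) are assumptions made for exposition, not the paper's actual NEUF implementation.

```python
# Illustrative sketch only: a generic k-NN graph Laplacian filter on point
# cloud color attributes, in the spirit of the non-Euclidean filtering the
# abstract describes. Not the authors' method.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix, diags
from sklearn.linear_model import Ridge

def knn_laplacian(points, k=8, sigma=0.05):
    """Sparse graph Laplacian L = D - W over a k-NN geometry graph."""
    tree = cKDTree(points)
    dist, idx = tree.query(points, k=k + 1)  # first neighbor is the point itself
    rows = np.repeat(np.arange(len(points)), k)
    cols = idx[:, 1:].ravel()
    w = np.exp(-(dist[:, 1:].ravel() ** 2) / (2 * sigma ** 2))  # Gaussian weights
    W = coo_matrix((w, (rows, cols)), shape=(len(points),) * 2)
    W = 0.5 * (W + W.T)                                # symmetrize the graph
    L = diags(np.asarray(W.sum(axis=1)).ravel()) - W   # L = D - W
    return L.tocsr()

def laplacian_descriptor(points, colors, k=8):
    """High-pass filter colors with L; summary statistics form the descriptor."""
    L = knn_laplacian(points, k=k)
    hp = L @ colors                                    # local color variation
    energy = np.linalg.norm(hp, axis=1)
    return np.array([energy.mean(), energy.std(), np.percentile(energy, 95)])

def fit_quality_model(descriptor_matrix, mos_scores):
    """Stand-in for the regression-learning stage: descriptors -> MOS."""
    return Ridge(alpha=1.0).fit(descriptor_matrix, mos_scores)
```

Stacking the outputs of `laplacian_descriptor` over a set of distorted clouds with known subjective scores would then train the quality model; the same construction applies to normal components in place of colors.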
measurement; visualization; point cloud compression; three-dimensional displays; filtering; estimation; distortion; geometry; color; quality assessment; point cloud; extended reality; quality metric
01 Journal publication::01a Journal article
File attached to this record
File: Di-Salvo_NEUF_2025.pdf (open access)
Type: Post-print (version following peer review, accepted for publication)
License: Creative Commons
Size: 9.12 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1738163
Citations
  • Scopus: 2