
Design of a Multi-Vision System for a Three-Dimensional Mug Shot Model to Improve Forensic Facial Identification / Giuliani, Samuele; Tosti, Francesco; Lopes, Pierpaolo; Ciampini, Claudio; Nardinocchi, Carla. - In: APPLIED SCIENCES. - ISSN 2076-3417. - 14:20(2024). [10.3390/app14209285]

Design of a Multi-Vision System for a Three-Dimensional Mug Shot Model to Improve Forensic Facial Identification

Nardinocchi Carla
Last author
2024

Abstract

A traditional mug shot consists of front and side views of a person from the shoulders up, taken by law enforcement. Forensic science is exploring the benefits of the 3D data offered by new technologies, and there is a growing need to work with 3D mug shots. Among the available techniques, a multi-view photogrammetric approach achieves the highest accuracy in the shortest acquisition time. In this work, a multi-view photogrammetric system for facial reconstruction based on low-cost cameras is developed, with the aims of verifying the performance of such cameras for producing a 3D mug shot with submillimetre accuracy and of assessing the improvement in facial matching obtained by using a 3D mug shot instead of traditional 2D mug shots. Tests were carried out in both a virtual and a real-world environment, using a virtual or a 3D-printed head model, respectively. The outcome is a point cloud describing the face. The errors were quantified through the distances between the point cloud and the mesh of the reference 3D model: 80% of the points lie within ±1 mm. Finally, the facial recognition performance of the 3D mug shot is evaluated against the traditional 2D mug shot using the NeoFace Watching software (NeoFACE), with a score increment of up to 0.42 points, especially in scenarios where the suspect is not captured from a frontal view.
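The accuracy metric described above — the fraction of reconstructed points lying within ±1 mm of the reference model — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper compares the point cloud against the reference *mesh*, whereas this sketch uses a simpler point-to-point nearest-neighbour distance as a proxy, with hypothetical data standing in for a real reconstruction.

```python
import numpy as np

def fraction_within_tolerance(cloud, reference, tol_mm=1.0):
    """For each reconstructed point, find the distance to its nearest
    reference point and return the fraction within ±tol_mm.

    cloud, reference: (N, 3) and (M, 3) arrays of XYZ coordinates in mm.
    Point-to-point proxy for the cloud-to-mesh comparison in the abstract.
    """
    # Pairwise Euclidean distances (brute force; fine for small clouds).
    diff = cloud[:, None, :] - reference[None, :, :]   # (N, M, 3)
    dists = np.linalg.norm(diff, axis=2).min(axis=1)   # nearest-neighbour distance per point
    return float((dists <= tol_mm).mean())

# Hypothetical example: a reference point set and a noisy "reconstruction" of it.
rng = np.random.default_rng(0)
reference = rng.uniform(0, 100, size=(500, 3))          # coordinates in mm
cloud = reference + rng.normal(0, 0.5, size=(500, 3))   # 0.5 mm reconstruction noise
print(fraction_within_tolerance(cloud, reference))
```

A production pipeline would instead compute signed point-to-surface distances against the triangulated mesh (as cloud-comparison tools such as CloudCompare do), since nearest-point distances overestimate the error where the reference sampling is sparse.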
3D model; face matching; mug shot; multi-view photogrammetric system
01 Journal publication::01a Journal article
Files attached to this item
File: Nardinocchi_Applied-Science_2024.pdf
Access: open access
Note: Article
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 9.63 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1730197
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0