
Interpretable classification of Levantine ceramic thin sections via neural networks / Capriotti, Sara; Devoto, Alessio; Scardapane, Simone; Mignardi, Silvano; Medeghini, Laura. - In: MACHINE LEARNING: SCIENCE AND TECHNOLOGY. - ISSN 2632-2153. - 6:2(2025). [10.1088/2632-2153/ade6c4]

Interpretable classification of Levantine ceramic thin sections via neural networks

Sara Capriotti; Alessio Devoto; Simone Scardapane; Silvano Mignardi; Laura Medeghini
2025

Abstract

Classification of ceramic thin sections is fundamental for understanding ancient pottery production techniques, provenance, and trade networks. Although effective, traditional petrographic analysis is time-consuming. This study explores the application of deep learning models, specifically convolutional neural networks (CNNs) and vision transformers (ViTs), as complementary tools to support the classification of Levantine ceramics based on their petrographic fabrics. A dataset of 1424 thin-section images from 178 ceramic samples, belonging to several archaeological sites across the Levantine area and mostly from the Bronze Age, with a few samples dating to the Iron Age, was used to train and evaluate these models. The results demonstrate that transfer learning significantly improves classification performance, with a ResNet18 model achieving 92.11% accuracy and a ViT reaching 88.34%. Explainability techniques, including Guided Gradient-based Class Activation Maps and attention maps, were applied to interpret and visualize the models’ decisions, revealing that both CNNs and ViTs successfully focus on key mineralogical features for the classification of the samples into their respective petrographic fabrics. These findings highlight the potential of explainable AI in archaeometric studies, providing a reproducible and efficient methodology for ceramic analysis while maintaining transparency in model decision-making.
Levantine ceramics; materials science; deep learning; explainable AI
01 Journal publication::01a Journal article
Files attached to this record
File: Capriotti_Interpretable-classification_2025.pdf
Access: open access
Note: DOI 10.1088/2632-2153/ade6c4
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 6.69 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, with all rights reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1746780
Citations
  • Scopus: 0