Discrimination between the facial gestures of vocalising and non-vocalising lemurs and small apes using deep learning / Carugati, Filippo; Friard, Olivier; Protopapa, Elisa; Mancassola, Camilla; Rabajoli, Emanuela; De Gregorio, Chiara; Valente, Daria; Ferrario, Valeria; Cristiano, Walter; Raimondi, Teresa; Torti, Valeria; Lefaux, Brice; Miaretsoa, Longondraza; Giacoma, Cristina; Gamba, Marco. - In: ECOLOGICAL INFORMATICS. - ISSN 1574-9541. - 85:(2025), pp. 1-11. [10.1016/j.ecoinf.2024.102847]

Discrimination between the facial gestures of vocalising and non-vocalising lemurs and small apes using deep learning

Raimondi, Teresa;
2025

Abstract

Facial expression studies are essential in animal communication research, but manual inspection methods are only practical for small datasets. Deep learning techniques can help discriminate the facial configurations associated with vocalisations across large datasets. We extracted and labelled video frames of different primate species, trained deep learning models to identify key points on their faces, and computed the distances between them to characterise facial gestures. We then used machine learning algorithms to classify vocalised and non-vocalised gestures across species. The algorithms achieved higher-than-chance correct classification rates, with some exceeding 90%. Our work employs deep learning to map primate facial gestures and offers an innovative application of pose estimation systems. Our approach facilitates the investigation of facial repertoires across primate species and behavioural contexts, enabling comparative research in primate communication.
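For illustration only, the minimal Python sketch below shows one way the pipeline described in the abstract could look: pose-estimation keypoints (e.g. from DeepLabCut) are converted into pairwise distances, which then feed a classifier separating vocalised from non-vocalised frames. The keypoint names, toy data and choice of a random forest are assumptions for the sketch, not the authors' exact setup.

# Hypothetical sketch: distance-based features from facial keypoints,
# followed by a classifier, loosely mirroring the abstract's pipeline.
# Keypoint names and the random-forest choice are illustrative assumptions.
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Assume pose estimation yields, per video frame, (x, y) coordinates
# for a fixed set of facial keypoints (names are hypothetical).
KEYPOINTS = ["upper_lip", "lower_lip", "left_mouth_corner",
             "right_mouth_corner", "left_eye", "right_eye", "chin"]

n_frames = 200
coords = rng.random((n_frames, len(KEYPOINTS), 2))   # toy coordinates
labels = rng.integers(0, 2, size=n_frames)           # 1 = vocalising frame (toy labels)

# Feature vector per frame: Euclidean distance between every keypoint pair.
pairs = list(combinations(range(len(KEYPOINTS)), 2))
features = np.stack([
    np.linalg.norm(coords[:, i, :] - coords[:, j, :], axis=1)
    for i, j in pairs
], axis=1)

# Classify vocalised vs non-vocalised facial configurations.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")  # ~chance here, since the data are random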
Primate face; Indri indri; Propithecus diadema; Nomascus gabriellae; DeepLabCut; Acoustic communication
01 Journal publication::01a Journal article
Files attached to this product

File: 1-s2.0-S1574954124003893-main.pdf
Access: open access
Note: Carugati_Discrimination_2025
Type: Publisher's version (published version with the publisher's layout)
Licence: Creative Commons
Size: 6.79 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1743719
Citations
  • PubMed Central (PMC): not available
  • Scopus: 2
  • Web of Science: 2