
Towards an automated protocol for wildlife density estimation using camera‐traps / Zampetti, Andrea; Mirante, Davide; Palencia, Pablo; Santini, Luca. - In: METHODS IN ECOLOGY AND EVOLUTION. - ISSN 2041-210X. - (2024). [10.1111/2041-210X.14450]

Towards an automated protocol for wildlife density estimation using camera‐traps

Andrea Zampetti (first author); Davide Mirante (second author); Luca Santini (last author)
2024

Abstract

Camera‐traps are valuable tools for estimating wildlife population density, and recently developed models enable density estimation without the need for individual recognition. Still, the processing and analysis of camera‐trap data remain extremely time‐consuming. While algorithms for automated species classification are becoming more common, they have so far served only as supporting tools, limiting their potential for integration into ecological analyses without human supervision. Here, we assessed the capability of two camera‐trap‐based models to provide robust density estimates when image classification is carried out by machine learning algorithms. We simulated density estimation with Camera‐Trap Distance Sampling (CT‐DS) and the Random Encounter Model (REM) under different scenarios of automated image classification. We then applied the two models to estimate the density of three focal species (roe deer Capreolus capreolus, red fox Vulpes vulpes and Eurasian badger Meles meles) in a reserve in central Italy. Species detection and classification were carried out both by the user and by machine learning algorithms (MegaDetector and Wildlife Insights, respectively), and all outputs were used to estimate density and were ultimately compared. Simulation results suggested that CT‐DS can provide robust density estimates even under poor algorithm performance (down to 50% of images correctly classified), whereas REM is less predictable and depends on multiple factors. For both models, density estimates obtained from the MegaDetector output were highly consistent with those from manually labelled images. Although Wildlife Insights' performance differed greatly between species (recall: badger = 0.15; roe deer = 0.56; fox = 0.75), CT‐DS estimates did not vary significantly; REM, by contrast, systematically overestimated density, with little overlap in standard errors.
We conclude that CT‐DS and REM can be robust to the loss of images when machine learning algorithms are used to identify animals, with CT‐DS being an ideal candidate for a fully unsupervised framework. We propose guidelines for evaluating when and how to integrate machine learning into the analysis of camera‐trap data for density estimation, further strengthening the applicability of camera‐traps as a cost‐effective method for density estimation in (spatially and temporally) extensive multi‐species monitoring programmes.
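For context, REM estimates density from the camera encounter rate without individual recognition. The standard estimator (Rowcliffe et al. 2008; the notation below is assumed here, as the abstract does not give the formula) is commonly written as:

```latex
% Random Encounter Model density estimator (Rowcliffe et al. 2008).
% y/t : number of independent photographic encounters per unit camera effort
% v   : animal movement speed (day range)
% r, \theta : radius and angle of the camera detection zone
D = \frac{y}{t} \cdot \frac{\pi}{v \, r \, (2 + \theta)}
```

Because the encounter count y enters the estimator linearly, any fraction of encounters lost or misattributed by an automated classifier propagates directly into the density estimate, which is consistent with REM's sensitivity to classification performance reported above.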
automatization; camera-traps; density estimation; distance sampling; machine learning; megadetector; random encounter model; species classification
01 Journal publication::01a Journal article
Files attached to this item:

Zampetti_Towards-an-automated_2024.pdf (open access)
Note: journal article
Type: publisher's version (published version with the publisher's layout)
Licence: Creative Commons
Size: 642.01 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1726050