Fast and accurate crop and weed identification with summarized train sets for precision agriculture

Potena, C.; Nardi, D.; Pretto, A.
2017

Abstract

In this paper we present a perception system for agricultural robotics that enables an unmanned ground vehicle (UGV) equipped with a multispectral camera to automatically perform crop/weed detection and classification in real time. Our approach exploits a pipeline that includes two different convolutional neural networks (CNNs) applied to the input RGB + near-infrared (NIR) images. A lightweight CNN is used to perform fast and robust, pixel-wise, binary image segmentation, in order to extract the pixels that represent projections of 3D points belonging to green vegetation. A deeper CNN is then used to classify the extracted pixels between the crop and weed classes. A further important contribution of this work is a novel unsupervised dataset summarization algorithm that automatically selects from a large dataset the most informative subsets that best describe the original one. This makes it possible to streamline and speed up the otherwise extremely time-consuming manual dataset labeling process, while preserving good classification performance. Experiments performed on different datasets taken from a real farm robot confirm the effectiveness of our approach.
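The abstract gives only the high-level structure of the pipeline, not its implementation. Purely as an illustration, the two-stage hand-off it describes (lightweight segmentation CNN producing a vegetation mask, then a deeper CNN classifying vegetation patches as crop or weed) could be organized along the following lines. This is a minimal PyTorch-style sketch under assumed architectures and patch sizes; all class and function names here are hypothetical and are not the authors' code.

```python
# Illustrative sketch only: a two-stage crop/weed pipeline as described in the
# abstract. Network depths, channel counts, and the patch-based hand-off
# between the two CNNs are assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class VegetationSegNet(nn.Module):
    """Lightweight CNN: pixel-wise binary mask (vegetation vs. background)."""
    def __init__(self, in_channels=4):  # 4 channels: RGB + NIR
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # per-pixel vegetation logit
        )

    def forward(self, x):
        return self.net(x)

class CropWeedNet(nn.Module):
    """Deeper CNN: classifies a vegetation patch as crop or weed."""
    def __init__(self, in_channels=4, patch=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(128 * (patch // 8) ** 2, 2)  # crop / weed

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def classify_image(image, seg_net, cls_net, patch=32, thresh=0.5):
    """image: (4, H, W) RGB+NIR tensor -> list of ((y, x), class_id)."""
    mask = torch.sigmoid(seg_net(image.unsqueeze(0)))[0, 0] > thresh
    half, results = patch // 2, []
    for y, x in mask.nonzero().tolist():
        # Skip vegetation pixels too close to the border for a full patch.
        if half <= y < image.shape[1] - half and half <= x < image.shape[2] - half:
            p = image[:, y - half:y + half, x - half:x + half]
            results.append(((y, x), cls_net(p.unsqueeze(0)).argmax(1).item()))
    return results
```

The design point the abstract emphasizes is the split itself: the cheap network touches every pixel, while the expensive network only runs where vegetation was found, which is what makes real-time operation plausible.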
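The summarization algorithm is likewise not detailed in the abstract. A common unsupervised strategy for this kind of representative-subset selection, shown here purely as an assumed stand-in and not as the paper's actual method, is to cluster per-image descriptors and keep the sample nearest each cluster center as the subset to label manually.

```python
# Illustrative sketch only: unsupervised subset selection via k-means
# clustering of image descriptors. A generic stand-in, not the
# summarization algorithm actually proposed in the paper.
import numpy as np
from sklearn.cluster import KMeans

def summarize_dataset(descriptors, n_keep):
    """descriptors: (N, D) array, one feature vector per image.
    Returns indices of n_keep representative images to label manually."""
    km = KMeans(n_clusters=n_keep, n_init=10, random_state=0).fit(descriptors)
    keep = []
    for c in range(n_keep):
        members = np.where(km.labels_ == c)[0]
        # Keep the member nearest the centroid as the cluster representative.
        dists = np.linalg.norm(descriptors[members] - km.cluster_centers_[c], axis=1)
        keep.append(int(members[dists.argmin()]))
    return sorted(keep)

# Usage: annotate only the returned subset instead of the full dataset, e.g.
# idx = summarize_dataset(features, n_keep=200)
```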
2017
14th International Conference on Intelligent Autonomous Systems (IAS-14)
agriculture robotics; classification; segmentation; convolutional neural networks
04 Publication in conference proceedings::04b Conference paper in volume
Fast and accurate crop and weed identification with summarized train sets for precision agriculture / Potena, C.; Nardi, D.; Pretto, A. - PRINT. - 531:(2017), pp. 105-121. (Paper presented at the 14th International Conference on Intelligent Autonomous Systems (IAS-14), held in Shanghai, China) [10.1007/978-3-319-48036-7_9].
Files attached to this item

Potena_Postprint-Fast-and-accurate_2016.pdf
Open Access from 02/10/2018
Type: Post-print document (version following peer review and accepted for publication)
License: All rights reserved
Size: 3.18 MB
Format: Adobe PDF

Potena_Fast-and-Accurate_2017.pdf
Restricted to archive administrators only (contact the author)
Type: Publisher's version (version published with the publisher's layout)
License: All rights reserved
Size: 462.11 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/933191
Citations
  • PMC: ND
  • Scopus: 121
  • Web of Science: 104