
Crop and Weeds Classification for Precision Agriculture Using Context-Independent Pixel-Wise Segmentation / Fawakherji, Mulham; Youssef, Ali; Bloisi, Domenico; Pretto, Alberto; Nardi, Daniele. - (2019), pp. 146-152. (Paper presented at the 3rd IEEE International Conference on Robotic Computing, IRC 2019, held in Naples, Italy) [10.1109/IRC.2019.00029].

Crop and Weeds Classification for Precision Agriculture Using Context-Independent Pixel-Wise Segmentation

Fawakherji, Mulham; Youssef, Ali; Bloisi, Domenico; Pretto, Alberto; Nardi, Daniele
2019

Abstract

Precision agriculture is gaining increasing attention because of the possible reduction of agricultural inputs (e.g., fertilizers and pesticides) that can be obtained by using high-tech equipment, including robots. In this paper, we focus on an agricultural robotics system that addresses the weeding problem by means of selective spraying or mechanical removal of the detected weeds. In particular, we describe a deep learning-based method that allows a robot to perform an accurate weed/crop classification using a sequence of two Convolutional Neural Networks (CNNs) applied to RGB images. The first network, based on an encoder-decoder segmentation architecture, performs a pixel-wise, plant-type-agnostic segmentation between vegetation and soil that enables the extraction of a set of connected blobs representing plant instances. We show that this network can also be trained using external, ready-to-use, pixel-wise labeled datasets coming from different contexts. Each plant is then classified as crop or weed by the second network. Quantitative experimental results, obtained on real-world data, demonstrate that the proposed approach can achieve good classification results even on challenging images.
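The intermediate step between the two CNNs — turning the pixel-wise vegetation/soil mask into per-plant instances — can be illustrated with a minimal connected-components sketch. This is not the authors' implementation; the binary `mask` below merely stands in for the output of the first (segmentation) network, and the returned bounding boxes are what would be fed, as image crops, to the second (crop/weed) classifier.

```python
from collections import deque
import numpy as np

def extract_blobs(mask):
    """Extract 4-connected blobs from a binary vegetation mask.

    `mask`: 2D array where 1 = vegetation pixel, 0 = soil
    (a stand-in for the first CNN's segmentation output).
    Returns a list of bounding boxes (row_min, col_min, row_max, col_max),
    one per plant instance.
    """
    visited = np.zeros_like(mask, dtype=bool)
    boxes = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # BFS flood fill over this blob, tracking its extents
                q = deque([(r, c)])
                visited[r, c] = True
                rmin = rmax = r
                cmin = cmax = c
                while q:
                    y, x = q.popleft()
                    rmin, rmax = min(rmin, y), max(rmax, y)
                    cmin, cmax = min(cmin, x), max(cmax, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            q.append((ny, nx))
                boxes.append((rmin, cmin, rmax, cmax))
    return boxes

# Toy mask containing two separate "plants"
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 1, 1],
], dtype=np.uint8)
print(extract_blobs(mask))  # two bounding boxes, one per blob
```

In practice a library routine such as `scipy.ndimage.label` or OpenCV's `connectedComponents` would replace the hand-rolled flood fill; the sketch only shows why a plant-type-agnostic segmentation is enough to isolate individual plant instances for the second classification stage.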
3rd IEEE International Conference on Robotic Computing, IRC 2019
crop weed classification; deep learning; precision agriculture; robot vision; Artificial Intelligence
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this product
File: Fawakherji_Crop-and-Weeds_2019.pdf (access restricted to repository managers)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 1.76 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1261462
Citations
  • Scopus: 105
  • Web of Science: 71