
A perception-guided CNN for grape bunch detection / Bruni, V.; Dominijanni, G.; Vitulano, D.; Ramella, G. - In: MATHEMATICS AND COMPUTERS IN SIMULATION. - ISSN 0378-4754. - 230:(2025), pp. 111-130. [10.1016/j.matcom.2024.11.004]

A perception-guided CNN for grape bunch detection

Bruni V.; Dominijanni G.; Vitulano D.; Ramella G.
2025

Abstract

Precision Viticulture (PV) has become an active, interdisciplinary research field, since it requires solving challenging research problems to meet the demands of specific use cases. One such problem is the development of automatic methods for yield estimation. Computer vision methods can contribute to this task, especially those that replicate what winemakers do manually. In this paper, an automatic artificial intelligence method for grape bunch detection from RGB images is presented. A customized Convolutional Neural Network (CNN) is employed for pointwise classification of image pixels, and the dependence of the classification results on the choice of input color channels and on grape color properties is studied. The advantage of using additional perception-based input features, such as luminance and visual contrast, is also evaluated, as is the dependence of the method on the choice of the training set in terms of the amount of labeled data. The latter point has a significant impact on the practical use of the method on-site, its usability by non-expert users, and its adaptability to individual vineyards. Experimental results show that a properly trained CNN can discriminate and detect grape bunches even under uncontrolled acquisition conditions and with limited computational load, making the proposed method implementable on smart devices and suitable for on-site, real-time applications.
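As an illustration only (this is not the authors' implementation, which is described in the full paper), the kind of perception-based input channels mentioned in the abstract could be assembled roughly as follows: the RGB planes are stacked with a luminance channel and a local-contrast channel to form a per-pixel feature map for the classifier. The function name, the BT.709 luminance weights, and the std/mean contrast proxy are all assumptions made for this sketch.

```python
import numpy as np

def pixel_features(rgb):
    """Stack RGB with perception-based channels (luminance, local contrast).

    `rgb`: H x W x 3 float array in [0, 1]. The luminance weights follow
    ITU-R BT.709; the contrast measure (local std / local mean over a 3x3
    window) is only an illustrative proxy, not the paper's exact definition.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b  # BT.709 luminance

    # Gather the 3x3 neighborhood of every pixel (edge-padded), then
    # compute a Weber-like contrast: local std divided by local mean.
    pad = np.pad(lum, 1, mode="edge")
    win = np.stack([pad[i:i + lum.shape[0], j:j + lum.shape[1]]
                    for i in range(3) for j in range(3)], axis=-1)
    contrast = win.std(axis=-1) / (win.mean(axis=-1) + 1e-6)

    return np.dstack([rgb, lum, contrast])  # H x W x 5 per-pixel features

img = np.random.rand(8, 8, 3)
feats = pixel_features(img)
print(feats.shape)  # (8, 8, 5)
```

Each pixel of the resulting H x W x 5 array would then be classified (grape / non-grape) by the CNN; whether such extra channels actually help is exactly what the paper evaluates.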
color opponents; convolutional neural network; grape bunch detection; pixel-wise classification; precision viticulture; visual contrast
01 Journal publication::01a Journal article
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1727560
Warning! The displayed data have not been validated by the university.

Citations
  • Scopus: 0