Crop type mapping by using transfer learning / Nowakowski, A.; Mrziglod, J.; Spiller, D.; Bonifacio, R.; Ferrari, I.; Mathieu, P. P.; Garcia-Herranz, M.; Kim, D.-H. - In: International Journal of Applied Earth Observation and Geoinformation. - ISSN 1872-826X. - 98 (2021), pp. 1-12. [DOI: 10.1016/j.jag.2021.102313]
Crop type mapping by using transfer learning
Spiller D.
Methodology
2021
Abstract
Crop type mapping is currently an important problem in remote sensing. Accurate information on the extent and types of crops derived from remote sensing can help manage and improve agriculture, especially in developing countries where such information is scarce. In this paper, high-resolution RGB drone images are the input data for classification performed with a transfer learning (TL) approach. VGG16 and GoogLeNet, pre-trained convolutional neural networks (CNNs) originally developed for computer vision classification tasks, are considered for crop type mapping. Thanks to the transferred knowledge, the proposed models successfully classify the studied crop types with high overall accuracy in the two considered cases, achieving up to almost 83% on the Malawi dataset and up to 90% on the Mozambique dataset. Notably, these results are comparable to those achieved by the same deep CNN architectures in many computer vision tasks. In drone data analysis, the application of deep CNNs has so far been very limited because of the large number of samples needed to train such complex architectures. Our results demonstrate that transfer learning is an efficient way to overcome this problem and take full advantage of the benefits of deep CNN architectures for drone-based crop type mapping. Moreover, based on experiments with different TL approaches, we show that the number of frozen layers is an important TL parameter, and that fine-tuning all the CNN weights yields significantly better performance than fine-tuning only the last few layers.
File | Access | Type | License | Size | Format
---|---|---|---|---|---
Nowakowski_Crop-Type_2021.pdf | Open access | Publisher's version (published with the publisher's layout) | Creative Commons | 840.39 kB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.