Segmentation-based joint classification of SAR and optical images

LOMBARDO, Pierfrancesco
2002

Abstract

The authors devise a new data fusion technique for classification that optimally exploits the information contained in single-channel SAR and optical images. In fact, even though good classification performance can be achieved by using multiple optical channels, it is shown that classification results from a single optical channel are not appreciably better than those obtained with a single SAR image. The impact of fusing SAR and optical images is investigated quantitatively. To characterise the achievable limits, lower and upper bounds on classification performance are introduced, corresponding respectively to single-pixel classification and joint classification of all pixels in the regions defined by the ground truth. First, an optimised technique for single-channel image segmentation followed by maximum likelihood (ML) classification is proposed. A significant performance improvement is demonstrated by classifying the homogeneous regions identified by segmentation, instead of single pixels or even small windows of 3 × 3 pixels; indeed, the result consistently approaches the upper bound. Next, the proposed ML segmentation technique is extended to exploit the information available in SAR and optical images jointly to define the best set of segments. The segmented SAR and optical images are then used together to obtain the best possible classification of the segments, yielding a significant performance improvement with respect to using either sensor alone. Classification performance is again significantly better than with single pixels or small windows, and approaches the upper bound.
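The abstract describes a processing chain of segmentation followed by maximum-likelihood classification of whole segments, with SAR/optical fusion obtained by combining the likelihoods of the two sensors. As a minimal sketch of the segment-wise ML step, the Python code below assumes a multilook gamma model for the SAR intensity and a Gaussian model for the optical channel, and fuses them by summing log-likelihoods under a conditional-independence assumption; the class parameters, function names and number of looks are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from scipy.stats import gamma, norm


def sar_segment_loglik(intensity, class_mean, looks=4):
    """Log-likelihood of a segment's SAR intensities under a multilook
    gamma (speckle) model with the given class mean intensity."""
    return gamma.logpdf(intensity, a=looks, scale=class_mean / looks).sum()


def optical_segment_loglik(values, class_mean, class_std):
    """Log-likelihood of a segment's optical values under a Gaussian model."""
    return norm.logpdf(values, loc=class_mean, scale=class_std).sum()


def classify_segment(sar_pixels, opt_pixels, class_params, looks=4):
    """Assign the whole segment to the class maximising the joint
    log-likelihood, fusing SAR and optical data by summing their
    log-likelihoods (conditional independence given the class)."""
    best_label, best_ll = None, -np.inf
    for label, p in class_params.items():
        ll = (sar_segment_loglik(sar_pixels, p["sar_mean"], looks)
              + optical_segment_loglik(opt_pixels, p["opt_mean"], p["opt_std"]))
        if ll > best_ll:
            best_label, best_ll = label, ll
    return best_label


# Hypothetical class parameters, for illustration only.
class_params = {
    "water":  {"sar_mean": 0.05, "opt_mean": 40.0,  "opt_std": 6.0},
    "forest": {"sar_mean": 0.30, "opt_mean": 90.0,  "opt_std": 12.0},
    "urban":  {"sar_mean": 0.80, "opt_mean": 150.0, "opt_std": 20.0},
}

rng = np.random.default_rng(0)
sar_segment = rng.gamma(shape=4, scale=0.30 / 4, size=200)  # simulated 4-look "forest" SAR segment
opt_segment = rng.normal(loc=90.0, scale=12.0, size=200)    # matching optical segment
print(classify_segment(sar_segment, opt_segment, class_params))  # expected: forest
```

In this sketch, the single-pixel classification quoted as the lower bound corresponds to calling classify_segment on one pixel at a time, while the upper bound corresponds to passing all pixels of a ground-truth region; segmentation-based classification sits in between, operating on the homogeneous regions found by the segmenter.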
2002
01 Journal publication::01a Journal article
Segmentation-based joint classification of SAR and optical images / Macri Pellizzeri, T.; Oliver, C. J.; Lombardo, Pierfrancesco. - In: IEE PROCEEDINGS. RADAR, SONAR AND NAVIGATION. - ISSN 1350-2395. - 149:6 (2002), pp. 281-296. [10.1049/ip-rsn:20020714]
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/45509

Citations
  • PMC: not available
  • Scopus: 17
  • Web of Science (ISI): 8