
Building Height Estimation from COSMO-SkyMed Imagery through Deep Learning Methods / Memar, B.; Russo, L.; Ullo, S. L.; Gamba, P.. - (2025), pp. 1-4. (2025 Joint Urban Remote Sensing Event (JURSE)) [10.1109/JURSE60372.2025.11076010].

Building Height Estimation from COSMO-SkyMed Imagery through Deep Learning Methods

Memar B.;
2025

Abstract

Building height estimation is a challenging and essential task in various fields such as disaster risk management, urban planning, and change detection evaluation. However, accurate measurement of building heights is difficult due to the varied characteristics and complexity of urban structures in cities. One key issue to be addressed is whether building height estimation is more effectively performed by treating each pixel as an independent unit or by analyzing the building as an integrated object composed of multiple pixels. This distinction is crucial, as it can substantially affect both the accuracy of the results and the practical applications of the analysis. In this paper, we present two deep learning-based methodologies for estimating building heights from a single high-resolution COSMO-SkyMed image. The first methodology employs an Attention-UNet model and functions as a pixel-wise approach, while the second uses ResNet101 as its core architecture and operates in an object-based manner. The urban area of Milan, Italy, was selected as the study area. The results indicate that the first methodology achieves a lower Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) than the second. The comparison of the two methodologies offers valuable insights for decision-makers, providing a clearer understanding of urban environments. The code used in this study will be made publicly available on GitHub upon acceptance of the work.
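The distinction the abstract draws between pixel-wise and object-based evaluation can be illustrated with a minimal sketch. The code below is hypothetical and not the authors' implementation: it assumes a per-pixel height map `pred`, a reference map `truth`, and a `labels` map assigning each pixel a building ID (0 = background), then computes the MAE and RMSE metrics mentioned in the abstract both per pixel and after collapsing predictions to one height per building footprint.

```python
import numpy as np

def mae_rmse(pred, truth):
    """Return (MAE, RMSE) between two height arrays, in the same units."""
    err = pred - truth
    return np.abs(err).mean(), np.sqrt((err ** 2).mean())

def object_based(pred, labels):
    """Collapse pixel-wise predictions to one height per building
    by averaging over each building's footprint (label 0 = background)."""
    out = np.zeros_like(pred)
    for b in np.unique(labels):
        if b == 0:
            continue
        mask = labels == b
        out[mask] = pred[mask].mean()
    return out

# Toy 4x4 scene with two buildings of known heights (metres)
truth = np.array([[10., 10., 0., 0.],
                  [10., 10., 0., 0.],
                  [0., 0., 25., 25.],
                  [0., 0., 25., 25.]])
labels = np.array([[1, 1, 0, 0],
                   [1, 1, 0, 0],
                   [0, 0, 2, 2],
                   [0, 0, 2, 2]])
pred = truth + np.random.default_rng(0).normal(0, 1, truth.shape)

mae_px, rmse_px = mae_rmse(pred, truth)  # pixel-wise scores
mae_ob, rmse_ob = mae_rmse(object_based(pred, labels)[labels > 0],
                           truth[labels > 0])  # object-based, building pixels only
```

Averaging over a footprint smooths per-pixel noise but discards within-building height variation, which is one way the pixel-wise and object-based formulations can diverge in accuracy.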
2025
2025 Joint Urban Remote Sensing Event (JURSE)
CNN; COSMO-SkyMed; deep learning; neural network; ResNet; SAR
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this product
Memar_Building-height-estimation_2025.pdf
Note: Contribution
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 3.5 MB
Format: Adobe PDF
Access: repository administrators only (contact the author)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1755738
Citations
  • Scopus: 0