A capsule network framework for flood mapping integrating remote sensing fusion techniques / Ahmadi, Pouya; Valadan Zoej, Mohammad Javad; Mokhtarzade, Mehdi; Kardan Halvaie, Nazila; Ghaderpour, Ebrahim. - In: ENVIRONMENTAL RESEARCH COMMUNICATIONS. - ISSN 2515-7620. - 7:6(2025). [10.1088/2515-7620/ade6d0]
A capsule network framework for flood mapping integrating remote sensing fusion techniques
Ghaderpour, Ebrahim (last author)
2025
Abstract
Flooding is one of the most serious natural hazards worldwide, causing significant and often irreversible damage to economies, infrastructure, and human health in urban areas. The rising frequency of flood events and concerns about future occurrences underscore the urgent need for thorough investigations into flood dynamics. Creating a flood hazard map is the first critical step in evaluating flood-related damage. In this work, a new flood mapping methodology is presented that integrates multiple datasets: optical and radar remote sensing data are fused using a deep learning capsule network. The multi-dimensional, multi-scale kernels and vector-form feature extraction of the proposed capsule network yield notable results. The methodology is implemented in three main steps: (1) fusing radar and optical data, (2) applying a capsule deep learning network to preserve the relationships among features extracted by the convolutional neural network (CNN), and (3) applying an attention mechanism to enhance the accuracy of flood zoning. The method is applied to flood mapping in the Gorganrood watershed in northern Iran, using Sentinel-1 and Sentinel-2 imagery, and the results are rigorously compared with established methods, including a CNN and a weighted CNN ensemble (WCNNE). The proposed method outperformed the CNN by 6.96% and the WCNNE by 3.72% in overall accuracy and achieved a remarkable area under the receiver operating characteristic (ROC) curve of 98.34%. Incorporating the attention mechanism further increased the accuracy of the proposed network by 0.23%. These findings suggest that the proposed methodology has considerable potential for accurate floodwater mapping globally.
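The abstract's second step relies on capsule operations that keep features as vectors rather than scalars. As a minimal illustrative sketch only, not the authors' implementation, the two core operations of a standard capsule layer (the "squash" non-linearity and dynamic routing-by-agreement, as introduced in the original capsule network formulation) can be written in NumPy; all array shapes and variable names below are assumptions for illustration:

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    """Capsule 'squash' non-linearity: rescales a vector so its length
    lies in [0, 1) while its direction is preserved."""
    sq_norm = np.sum(v ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * v / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iters=3):
    """Routing-by-agreement from input-capsule predictions u_hat of
    shape (n_in, n_out, dim) to output capsule vectors (n_out, dim)."""
    n_in, n_out, dim = u_hat.shape
    b = np.zeros((n_in, n_out))                  # routing logits
    for _ in range(n_iters):
        # softmax over output capsules for each input capsule
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum -> (n_out, dim)
        v = squash(s)                            # output capsule vectors
        b = b + (u_hat * v[None]).sum(axis=-1)   # agreement update
    return v

# Toy example: 8 input capsules routed to 2 output capsules of dimension 4.
rng = np.random.default_rng(0)
u_hat = rng.normal(size=(8, 2, 4))
v = dynamic_routing(u_hat)
print(v.shape)  # (2, 4); each row's length is below 1 after squashing
```

The squashed vector length acts as the capsule's activation probability, which is why the routing update rewards input capsules whose predictions agree with the current output direction.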


