A General Framework for Change Detection Using Multi-modal Remote Sensing Data

Bovolo, Francesca;
2021-01-01

Abstract

A general framework for change detection is proposed to analyze multimodal remotely sensed data utilizing the Kronecker product between two data representations (vectors or matrices). The proposed method is sensor independent and provides results comparable to techniques that exist for specific sensors. The proposed fusion technique is a pixel-level approach that incorporates inputs from different modalities, rendering an enriched multimodal data representation. Thus, the proposed hybridization procedure helps to assimilate multisensor information in a meaningful manner. A novel change index (ζ) is defined for the general multimodal case. This index is then used to quantify the change in bitemporal remotely sensed data. This article explores the usability, consistency, and robustness of the proposed multimodal fusion framework, including the change index, with proper validation on two multimodal cases: 1) the dual-frequency (C- and L-band) fully polarimetric Danish EMISAR data and 2) the dual-polarimetric synthetic aperture radar and Sentinel-2 multispectral data. Detailed analysis and validation using extensive ground-truth data are presented to establish the proposed framework.
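The abstract does not give the exact formulation of the fusion step, but the core idea it names, combining two per-pixel data representations via their Kronecker product, can be sketched as follows. All names, shapes, and values below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch of pixel-level Kronecker-product fusion.
# Each pixel contributes one feature vector per modality; their
# Kronecker product yields an enriched vector containing every
# pairwise product of the two inputs. Names/shapes are illustrative.
import numpy as np

def kron_fuse(pixel_a: np.ndarray, pixel_b: np.ndarray) -> np.ndarray:
    """Fuse two per-pixel feature vectors via the Kronecker product."""
    return np.kron(pixel_a, pixel_b)

# Toy example: a 3-channel SAR-like vector and a 4-band optical-like vector.
sar_pixel = np.array([0.2, 0.5, 0.1])
msi_pixel = np.array([0.3, 0.6, 0.4, 0.9])
fused = kron_fuse(sar_pixel, msi_pixel)
print(fused.shape)  # (12,): every pairwise product of the two inputs
```

For a length-m vector and a length-n vector the fused representation has length m·n, which is what makes this a sensor-independent recipe: any two modality vectors can be combined without assumptions about their channels.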
Files for this item:
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11582/328990