
A Multitask Framework With Cross-Task Selective Feature Sharing for Remote Sensing Image Time-Series Analysis

Atanasova, Milena; Bergamasco, Luca; Bovolo, Francesca
2026-01-01

Abstract

Satellite image time series are used to solve dynamic Earth observation tasks, such as change detection and land cover classification, due to their rich information content. The amount of information in image time series and the interconnection among tasks suggest that solving the tasks simultaneously, in a unified framework that takes advantage of a shared representation, may be beneficial. However, most methods in the literature perform a single task only, while methods addressing multiple tasks deal with single-date or bitemporal images; thus, they fail to, first, model temporal dynamics in time series and, second, model temporal correlations across tasks. In this article, we propose a novel multitask deep learning framework to analyze remote sensing image time series (more than two images) in terms of various temporal relationships, handling both short-term and long-term temporal information. The framework aims to learn both task-common and task-specific representations to capture spatio-temporal task interdependencies. To this end, we use feature sharing in a multitask setting with a single encoder and a decoder for each task. We design a novel cross-task selective feature sharing mechanism to exchange spatio-temporal task information. We instantiate the framework in a model that addresses two tasks: frame-by-frame abrupt change detection and multitemporal semantic segmentation. The model is tested on three datasets of multispectral image time series, acquired by the Landsat-8 (annual series of 12 images), Sentinel-2 (annual series of 12 images), and Gaofen-2 (3 images across years) sensors. Results show the effectiveness of the proposed framework in various settings and demonstrate the validity of the multitask model.
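The architecture described in the abstract — a single shared encoder, one decoder branch per task, and a gated cross-task exchange of features — can be sketched in a few lines of numpy. This is a minimal illustration under assumptions, not the authors' implementation: the function names (`shared_encoder`, `selective_share`), the sigmoid-gated mixing, and all weight shapes are hypothetical stand-ins for the paper's learned modules.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_encoder(frames, W):
    # One shared projection applied to every frame of the time series,
    # producing task-common features (hypothetical stand-in encoder).
    return np.tanh(frames @ W)

def selective_share(feat_own, feat_other, gate_w):
    # A sigmoid gate decides, per feature, how much of the other task's
    # representation is mixed into this task's branch (assumed mechanism).
    gate = 1.0 / (1.0 + np.exp(-(feat_own * gate_w)))
    return feat_own + gate * feat_other

T, D, H = 12, 8, 16                 # 12-image annual series, D bands, H features
frames = rng.normal(size=(T, D))
W_enc = rng.normal(size=(D, H))

common = shared_encoder(frames, W_enc)           # task-common representation
feat_cd = common @ rng.normal(size=(H, H))       # change-detection branch
feat_seg = common @ rng.normal(size=(H, H))      # segmentation branch

# Each branch selectively borrows from the other before decoding.
feat_cd = selective_share(feat_cd, feat_seg, rng.normal(size=H))
feat_seg = selective_share(feat_seg, feat_cd, rng.normal(size=H))

print(common.shape, feat_cd.shape, feat_seg.shape)
```

The point of the sketch is the topology: one encoder pass is reused by both task branches, and each branch mixes in a gated portion of the other branch's features, rather than the two tasks being solved in isolation.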
Files in this item:
File: A_Multitask_Framework_With_Cross-Task_Selective_Feature_Sharing_for_Remote_Sensing_Image_Time-Series_Analysis.pdf
Access: open access
Type: Post-print document
License: Creative Commons
Size: 16.5 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11582/370827