Xia et al. (2025) Began+: Leveraging bi-temporal SAR-optical data fusion to reconstruct clear-sky satellite imagery under large cloud cover
Identification
- Journal: Remote Sensing of Environment
- Year: 2025
- Date: 2025-12-11
- Authors: Yu Xia, Wei He, Liangpei Zhang, Hongyan Zhang
- DOI: 10.1016/j.rse.2025.115171
Research Groups
- State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan, PR China
- School of Computer Science, China University of Geosciences, Wuhan, PR China
Short Summary
This paper introduces Began+, a deep learning framework that fuses bi-temporal SAR and optical data to reconstruct clear-sky satellite imagery, addressing large cloud cover while restoring temporal changes. The framework synthesizes high-quality Landsat-8 and Sentinel-2 images, and the authors open-source two globally distributed cloud removal datasets.
Objective
- To develop a robust deep learning-based framework that integrates bi-temporal SAR and optical data to reconstruct clear-sky satellite imagery, specifically addressing challenges posed by large cloud cover, the restoration of temporal changes, and the practical limitations of existing deep models.
Study Configuration
- Spatial Scale: Global (datasets BiS1L8-CR and BiS1S2-CR are globally distributed), large-scale dual-sensor imagery.
- Temporal Scale: Bi-temporal (a SAR image paired with an earlier, pre-temporal optical image), with a focus on restoring temporal changes between acquisitions.
Methodology and Data
- Models used: Began+ framework, which includes:
  - Began (Bi-output Enhanced Generative Adversarial Network)
  - Enhanced Channel-wise Fusion Block (ECFB)
  - Multi-scale Depth-wise Convolution Residual Block (MDRB)
  - Dual-tasking optimization and co-learning strategy
  - Flexible post-processing step (cloud masking and traditional gap-filling algorithms)
- Data sources:
  - Synthetic Aperture Radar (SAR) data
  - Optical satellite imagery (Landsat-8, Sentinel-2)
  - Two custom-built, globally distributed cloud removal datasets: BiS1L8-CR and BiS1S2-CR
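The dual-tasking co-learning strategy listed above (image synthesis jointly with change detection) can be sketched as a weighted sum of a synthesis loss and a change-detection loss. The specific losses (L1 and binary cross-entropy) and the weight `lam` below are illustrative assumptions, not the paper's exact objective.

```python
import numpy as np

def dual_task_loss(pred_img, true_img, pred_change, true_change, lam=0.5):
    """Hypothetical co-learning objective: an L1 image-synthesis loss
    plus a binary cross-entropy change-detection loss, weighted by lam.
    The actual losses and weighting in Began+ may differ."""
    l_syn = np.mean(np.abs(pred_img - true_img))          # reconstruction term
    eps = 1e-7
    p = np.clip(pred_change, eps, 1 - eps)                # avoid log(0)
    l_change = -np.mean(true_change * np.log(p)
                        + (1 - true_change) * np.log(1 - p))
    return l_syn + lam * l_change

# tiny example: perfect synthesis, slightly imperfect change map
pred = np.array([0.9, 0.1])
truth = np.array([1.0, 0.0])
loss = dual_task_loss(np.zeros(4), np.zeros(4), pred, truth, lam=0.5)
```

Co-learning the two heads lets the change-detection branch tell the generator which pixels genuinely changed between the two dates, rather than treating all differences as cloud artifacts.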
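The flexible post-processing step (cloud masking plus traditional gap filling) can be sketched as a simple mask-guided blend: observed clear-sky pixels are kept, and cloudy pixels are filled from the generator's synthesized image. The function below is a hypothetical stand-in for illustration, not the paper's exact gap-filling algorithm.

```python
import numpy as np

def mask_guided_fill(observed, synthesized, cloud_mask):
    """Blend an observed optical image with a synthesized one.

    observed    : (H, W, C) cloudy optical image
    synthesized : (H, W, C) clear-sky image produced by the generator
    cloud_mask  : (H, W) boolean array, True where pixels are cloudy

    Clear pixels keep their observed values; cloudy pixels are filled
    from the synthesized image (assumed behavior of the post-processing).
    """
    out = observed.copy()
    out[cloud_mask] = synthesized[cloud_mask]
    return out

# tiny example: a 2x2 single-band image with one cloudy pixel
obs = np.array([[[0.2], [0.4]], [[0.9], [0.5]]])
syn = np.array([[[0.3], [0.3]], [[0.3], [0.3]]])
mask = np.array([[False, False], [True, False]])
filled = mask_guided_fill(obs, syn, mask)
```

Keeping observed pixels wherever the mask is clear is what makes the step "flexible": the learned generator only has to be trusted under the clouds.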
Main Results
- The Began+ framework effectively captures bi-temporal change features and reconstructs precise surface information in both Landsat-8 and Sentinel-2 satellite images, even under large cloud cover.
- Compared with state-of-the-art methods, Began+ shows clear qualitative and quantitative advantages in both simulated and real-data experiments.
- The framework enables accurate reconstruction of large-scale dual-sensor imagery under high-ratio cloud cover without strict constraints on input timing.
- It effectively restores changing surfaces and improves the quality of unsupervised vegetation extraction.
Contributions
- Introduction of Began+, a novel two-step deep learning framework for reconstructing clear-sky images under extensive cloud cover using bi-temporal SAR-optical data fusion.
- Design of Began, a bi-output enhanced generative adversarial network, capable of image synthesis while simultaneously detecting land-cover changes.
- Development of an enhanced channel-wise fusion block (ECFB) and a multi-scale depth-wise convolution residual block (MDRB) within the Began model.
- Creation and open-sourcing of two globally distributed cloud removal datasets, BiS1L8-CR and BiS1S2-CR, to support deep learning research.
- Demonstration of a flexible SAR-optical input timing capability, enhancing the framework's practicality for real-world applications.
Funding
- Not specified in the provided text.
Citation
@article{Xia2025Began,
author = {Xia, Yu and He, Wei and Zhang, Liangpei and Zhang, Hongyan},
title = {Began+: Leveraging bi-temporal SAR-optical data fusion to reconstruct clear-sky satellite imagery under large cloud cover},
journal = {Remote Sensing of Environment},
year = {2025},
doi = {10.1016/j.rse.2025.115171},
url = {https://doi.org/10.1016/j.rse.2025.115171}
}
Original Source: https://doi.org/10.1016/j.rse.2025.115171