Zhang et al. (2026) Three-dimensional cloud radar reflectivity reconstruction from geostationary multispectral imagery using a context-aware Transformer
Identification
- Journal: International Journal of Applied Earth Observation and Geoinformation
- Year: 2026
- Date: 2026-04-03
- Authors: Shenglan Zhang, Shihao Zhang, Ying Zhou, Hailei Liu, Minzheng Duan
- DOI: 10.1016/j.jag.2026.105275
Research Groups
- Key Laboratory of Atmospheric Sounding, Chengdu University of Information Technology, Chengdu 610225, China
- Institute of Atmospheric Physics, Chinese Academy of Sciences, Beijing 100029, China
Short Summary
This study develops a Transformer-based framework that retrieves continuous three-dimensional (3D) radar reflectivity fields from geostationary satellite imagery, demonstrating robust performance against CloudSat observations (R = 0.80, RMSE = 6.75 dBZ for composite reflectivity) and utility in monitoring severe weather such as hurricanes.
Objective
- To develop and validate a Transformer-based machine learning framework capable of estimating continuous three-dimensional (3D) cloud radar reflectivity structures from two-dimensional passive geostationary multispectral imagery.
Study Configuration
- Spatial Scale: Continental United States (CONUS), Western Hemisphere (GOES-16 full-disk), four major Atlantic hurricanes (Laura, Paulette, Delta, Zeta), and three ground-based ARM sites (Houston, Southern Great Plains, Gunnison-Crested Butte Regional Airport). Vertical profiles span 64 layers with 240 m spacing, covering an altitude range of approximately 0–15.4 km.
- Temporal Scale: Collocated ABI–CloudSat pairs from 2019–2020 for training and validation. Independent validation with ground-based radar in March 2022. Hurricane analysis for 2020 events. Geostationary satellite observations provide data at 10-minute intervals, enabling continuous day-and-night retrieval.
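The vertical extent quoted above follows directly from the stated grid parameters. A one-line check (assuming, for illustration, that the bins start at the surface):

```python
import numpy as np

# 64-layer, 240 m vertical grid: bin-top altitudes (surface start assumed).
n_layers, dz = 64, 240.0                 # layer count, spacing in metres
tops = np.arange(1, n_layers + 1) * dz   # altitude of each bin top
print(tops[-1] / 1e3)                    # 15.36 -> "approximately 0-15.4 km"
```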
Methodology and Data
- Models used:
  - Transformer-based deep learning framework.
  - Two-stage retrieval system: a classification Transformer for cloud mask detection and a regression Transformer for radar reflectivity estimation in cloudy layers.
  - Context-aware Transformer utilizing a sliding window of 32 adjacent satellite footprints.
  - Rotary Positional Embeddings (RoPE) for encoding relative ordering within the window.
- Data sources:
  - Geostationary Satellite Imagery: GOES-R Advanced Baseline Imager (ABI) Level-1b radiances (Channels 1, 3–16).
  - Spaceborne Cloud Radar: CloudSat 2B-GEOPROF product (radar reflectivity profiles, 64 vertical bins, 240 m spacing).
  - Ground-based Cloud Radar: Ka-band ARM Zenith Radar (KAZR) observations from three ARM sites (HOU, SGP, GUC) for independent structural validation.
  - Tropical Cyclone Best-Track Data: International Best Track Archive for Climate Stewardship (IBTrACS) for hurricane analysis.
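As a minimal sketch of the sliding-window design described above (not the authors' code; all names, dimensions, and the half-split RoPE variant are assumptions), the following shows how a window of 32 adjacent footprints could carry relative ordering via rotary positional embeddings: paired feature dimensions are rotated by position-dependent angles so that attention dot products depend only on relative offsets within the window.

```python
import numpy as np

def rope_rotate(x, positions, base=10000.0):
    """Apply rotary positional embeddings (RoPE) to per-footprint features.

    x: (window, dim) feature vectors for one sliding window.
    positions: (window,) integer offsets within the window.
    Each (x1_i, x2_i) dimension pair is rotated by angle pos * freq_i,
    so q.k dot products depend only on the relative position offset.
    """
    window, dim = x.shape
    half = dim // 2
    freqs = base ** (-np.arange(half) / half)       # per-pair frequencies
    angles = positions[:, None] * freqs[None, :]    # (window, half)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, :half], x[:, half:]
    return np.concatenate([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], axis=-1)

# Hypothetical setup: 32 adjacent ABI footprints, each embedded into a
# toy 16-dim feature space (the paper's embedding size is not stated).
rng = np.random.default_rng(0)
window = rng.normal(size=(32, 16))
encoded = rope_rotate(window, np.arange(32))
print(encoded.shape)  # (32, 16)
```

Because RoPE is a pure rotation of dimension pairs, it preserves vector norms, which is why it can be applied inside attention without rescaling the features.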
Main Results
- The proposed retrieval framework achieves robust performance for composite reflectivity, with a Pearson correlation coefficient (R) of 0.80 and a root-mean-square error (RMSE) of 6.75 dBZ for the Vis + IR model.
- For layer-wise reflectivity, the Vis + IR model yields R = 0.72 and RMSE = 8.15 dBZ, outperforming the IR-only model (R = 0.66, RMSE = 8.75 dBZ). Both models exhibit negligible mean biases.
- Cloud detection accuracy exceeds 80% at most altitudes, with a Probability of Detection (POD) reaching 70% and a False Alarm Rate (FAR) around 20% between 2 km and 10 km altitude.
- The context-aware Transformer (32-footprint window) significantly improves performance compared to single-pixel input, demonstrating the value of spatial context.
- Retrieval accuracy is highest in the upper troposphere (>6 km), consistent with the satellite's top-down viewing geometry.
- Independent validation against ground-based Ka-band ARM Zenith Radar (KAZR) observations shows high structural similarity (SSIM > 0.872), confirming the model's ability to reproduce realistic vertical cloud morphology across diverse climate regimes.
- The framework successfully captures the three-dimensional evolution of major Atlantic hurricanes, including eyewall consolidation and intensification processes.
- Retrieved composite reflectivity metrics exhibit a strong correlation (R > 0.8) with maximum sustained wind speeds for the analyzed hurricanes (e.g., R = 0.87 for Laura).
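The skill scores reported above follow standard retrieval-verification definitions. The sketch below (synthetic data, values illustrative only, not from the paper; FAR is taken here as the false alarm ratio, one common convention) computes R, RMSE, mean bias, POD, and FAR:

```python
import numpy as np

def verification_scores(obs, pred, cloud_obs, cloud_pred):
    """Standard verification metrics for a reflectivity retrieval.

    obs, pred: reflectivity (dBZ) at collocated cloudy samples.
    cloud_obs, cloud_pred: boolean cloud masks for detection scoring.
    """
    r = np.corrcoef(obs, pred)[0, 1]             # Pearson correlation
    rmse = np.sqrt(np.mean((pred - obs) ** 2))   # root-mean-square error
    bias = np.mean(pred - obs)                   # mean bias
    hits = np.sum(cloud_obs & cloud_pred)
    misses = np.sum(cloud_obs & ~cloud_pred)
    false_alarms = np.sum(~cloud_obs & cloud_pred)
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    return r, rmse, bias, pod, far

# Synthetic demonstration: truth plus ~7 dBZ scatter, 15% mask disagreement.
rng = np.random.default_rng(1)
obs = rng.uniform(-20, 20, 1000)
pred = obs + rng.normal(0, 7, 1000)
cloud_obs = rng.random(1000) < 0.5
cloud_pred = cloud_obs ^ (rng.random(1000) < 0.15)
print(verification_scores(obs, pred, cloud_obs, cloud_pred))
```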
Contributions
- Introduces a novel Transformer-based deep learning framework for continuous, high-resolution 3D cloud radar reflectivity reconstruction from geostationary multispectral imagery.
- Addresses the limitations of sparse coverage and narrow swaths of active spaceborne radars by providing wide-field, high-frequency 3D cloud structure information.
- Demonstrates the effectiveness of a context-aware Transformer in leveraging mesoscale cloud organization for improved retrieval accuracy compared to pixel-wise methods.
- Enables continuous day-and-night retrieval of 3D cloud structures, critical for monitoring rapidly evolving weather systems.
- Provides a quantitative basis for severe weather analysis, particularly for hurricane monitoring and intensity estimation, by linking retrieved 3D cloud fields to storm evolution and maximum sustained wind speeds.
- Offers valuable complementary information to sparse active profiling observations, bridging the gap between high temporal resolution imagers and vertical profiling radars.
Funding
- National Natural Science Foundation of China (Grant 42030107)
- National Key R&D Program of China (Grant 2021YFC3090203)
Citation
@article{Zhang2026Threedimensional,
author = {Zhang, Shenglan and Zhang, Shihao and Zhou, Ying and Liu, Hailei and Duan, Minzheng},
title = {Three-dimensional cloud radar reflectivity reconstruction from geostationary multispectral imagery using a context-aware Transformer},
journal = {International Journal of Applied Earth Observation and Geoinformation},
year = {2026},
doi = {10.1016/j.jag.2026.105275},
url = {https://doi.org/10.1016/j.jag.2026.105275}
}
Original Source: https://doi.org/10.1016/j.jag.2026.105275