Chen et al. (2026) An explainable correction and fusion framework for global bare-earth DTM generation in mountain areas
Identification
- Journal: Remote Sensing of Environment
- Year: 2026
- Date: 2026-03-11
- Authors: Jun Chen, Liyang Xiong, Guoan Tang
- DOI: 10.1016/j.rse.2026.115364
Research Groups
- State Key Laboratory of Climate System Prediction and Risk Management, Nanjing Normal University, Nanjing, China
- School of Geography, Nanjing Normal University, Nanjing, China
- Key Laboratory of Virtual Geographic Environment, Ministry of Education, Nanjing Normal University, Nanjing, China
- Jiangsu Center for Collaborative Innovation in Geographical Information Resource Development and Application, Nanjing Normal University, Nanjing, China
Short Summary
This study developed an explainable correction and fusion framework to generate high-accuracy global bare-earth Digital Terrain Models (DTMs) in mountainous regions, addressing the height biases that vegetation and buildings introduce into existing Digital Surface Models (DSMs). The framework, which combines AutoML-SHAP feature selection, a CNN-Transformer correction model, and a fusion model, achieved relative vertical accuracy improvements of 43.13%–76.86% over current GDEMs and correction methods.
Objective
- To develop an explainable correction and fusion framework for generating high-accuracy global bare-earth Digital Terrain Models (DTMs) in mountainous regions, by effectively removing height biases caused by vegetation and buildings in existing global digital elevation models (GDEMs).
Study Configuration
- Spatial Scale: Global, with a focus on mountainous regions. Model development utilized 120 globally distributed 1° × 1° high-resolution bare-earth DTM tiles. The generated DTMs have a resolution of 1 arcsecond (approximately 30 meters).
- Temporal Scale: The study focuses on generating a static bare-earth DTM using contemporary global elevation models and recent altimetry data (ICESat-2).
Methodology and Data
- Models used:
  - Explainable automated machine learning–Shapley additive explanations (AutoML-SHAP) framework for evaluating and optimizing 35 prediction features, identifying 15 effective features.
  - Hybrid convolutional neural network (CNN)–Transformer model for correcting GDEM biases.
  - Fusion model to integrate corrected results with multi-source global bare-earth DTMs (GDTMs) for improved accuracy outside training regions.
- Data sources:
  - 120 globally distributed 1° × 1° high-resolution bare-earth DTM tiles (for model development).
  - 1-arcsecond Copernicus DEM (CopDEM) as the target GDEM.
  - Globally distributed Ice, Cloud, and Land Elevation Satellite-2 (ICESat-2) altimetry.
  - Multi-source global digital elevation models (GDEMs) such as SRTM and the ALOS AW3D30 DEM.
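The Shapley-value principle behind the AutoML-SHAP feature screening can be illustrated with a toy, from-scratch sketch. This is not the authors' pipeline: the linear "bias model" and the features (canopy height, slope, NDVI) below are hypothetical, chosen only to show how Shapley values assign each feature its contribution to a predicted height bias.

```python
"""Toy Shapley-value feature attribution -- the principle behind SHAP-based
feature screening. Exact enumeration is exponential in the feature count,
so this is only usable on toy problems; SHAP libraries approximate it."""
from itertools import combinations
from math import factorial

def exact_shapley(predict, x, baseline):
    """Exact Shapley attributions for one sample x against a baseline."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):  # subset sizes 0 .. n-1
            for subset in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                z = list(baseline)
                for j in subset:
                    z[j] = x[j]
                without_i = predict(z)   # feature i absent (at baseline)
                z[i] = x[i]              # feature i present
                phi[i] += weight * (predict(z) - without_i)
    return phi

# Hypothetical linear height-bias model over (canopy height, slope, NDVI).
predict = lambda v: 0.8 * v[0] + 0.1 * v[1] + 0.0 * v[2]
phi = exact_shapley(predict, [20.0, 30.0, 0.7], [0.0, 0.0, 0.0])
# For a linear model each feature's Shapley value is its own term:
# phi is approximately [16.0, 3.0, 0.0]; a zero-coefficient feature
# (NDVI here) gets zero credit, so a screening step would drop it.
```

An ineffective feature receives (near-)zero attribution across samples, which is the kind of evidence such a framework can use to cut a candidate set (here, 35 features down to 15).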
Main Results
- The optimized features selected by the AutoML-SHAP framework substantially outperformed features chosen by traditional methods and those used in previous studies.
- The proposed CNN-Transformer correction model achieved average relative improvements in vertical accuracy ranging from 43.13% to 76.86% over the Copernicus DEM.
- The correction model surpassed eight other correction models and three existing GDTM products in terms of vertical accuracy and spatial consistency across diverse validation datasets.
- The fusion model consistently improved correction accuracy, particularly in regions located outside the initial training areas.
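Relative vertical-accuracy improvements like the 43.13%–76.86% quoted above are conventionally reported as the percent reduction in an error statistic such as RMSE against reference elevations; the exact statistic the authors use is not stated in this summary, and all numbers below are made up for illustration.

```python
"""Sketch of a relative vertical-accuracy improvement metric:
percent reduction in RMSE against reference ground elevations."""
import numpy as np

def rmse(pred, ref):
    """Root-mean-square vertical error against reference elevations."""
    return float(np.sqrt(np.mean((np.asarray(pred) - np.asarray(ref)) ** 2)))

def relative_improvement(rmse_before, rmse_after):
    """Percent reduction in RMSE achieved by the correction."""
    return 100.0 * (rmse_before - rmse_after) / rmse_before

ref = np.array([1200.0, 1350.0, 1500.0, 1275.0])  # e.g. ICESat-2 ground heights
dsm = ref + np.array([12.0, 9.0, 15.0, 10.0])     # uncorrected DSM: canopy bias
dtm = ref + np.array([2.0, -1.5, 3.0, 1.0])       # corrected DTM: residual error
print(relative_improvement(rmse(dsm, ref), rmse(dtm, ref)))  # ~82.8 here
```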
Contributions
- Integrated an explainable model (AutoML-SHAP), advanced deep learning techniques (CNN-Transformer), and fusion strategies with multi-source elevation datasets.
- Developed a novel framework for generating high-accuracy bare-earth global Digital Terrain Models (GDTMs) specifically for challenging mountainous regions worldwide.
- Provided valuable insights for future terrain modeling and geospatial analysis by addressing a critical limitation in existing GDEMs.
Funding
Not provided in the given paper text.
Citation
@article{Chen2026explainable,
  author  = {Chen, Jun and Xiong, Liyang and Tang, Guoan},
  title   = {An explainable correction and fusion framework for global bare-earth DTM generation in mountain areas},
  journal = {Remote Sensing of Environment},
  year    = {2026},
  doi     = {10.1016/j.rse.2026.115364},
  url     = {https://doi.org/10.1016/j.rse.2026.115364}
}
Original Source: https://doi.org/10.1016/j.rse.2026.115364