Ling et al. (2026) An improved Hydrology-Informed attention LSTM (HIA-LSTM) model for runoff simulation with seasonal snowmelt
Identification
- Journal: Journal of Hydrology
- Year: 2026
- Date: 2026-03-03
- Authors: Muwu Ling, Yashuo Guan, Yanqing Lian, Xiaonan Sun, Yongliang Gao, Yuling Ren
- DOI: 10.1016/j.jhydrol.2026.135231
Research Groups
- Yangtze Institute for Conservation and Development, Hohai University, Nanjing 210098, China
- The National Key Laboratory of Water Disaster Prevention, Hohai University, Nanjing 210098, China
- College of Hydrology and Water Resources, Hohai University, Nanjing 210098, China
Short Summary
This study proposes a Hydrology-Informed Attention LSTM (HIA-LSTM) that embeds physical inductive biases into its neural architecture to improve runoff simulation in alpine basins with complex cryospheric processes. The HIA-LSTM significantly outperforms conventional deep learning models, achieving superior accuracy and interpretability, especially in melt-driven runoff scenarios.
Objective
- To overcome the limitations of conventional "black-box" deep learning models for runoff simulation in alpine basins of the Tibetan Plateau, where complex cryospheric processes drive strong non-linear interactions among precipitation, snowmelt, and glacier melt, by proposing a Hydrology-Informed Attention LSTM (HIA-LSTM) that embeds physical inductive biases into the network architecture.
Study Configuration
- Spatial Scale: Six headwater watersheds in alpine basins of the Tibetan Plateau, spanning diverse hydrological regimes from the monsoon-dominated Yangtze source region to the westerlies-driven, melt-dominated Amu Darya (ADR-KK) watershed.
- Temporal Scale: Daily runoff simulation.
Methodology and Data
- Models used: Hydrology-Informed Attention LSTM (HIA-LSTM), standard Long Short-Term Memory (LSTM), attention-LSTM, multi-head attention LSTM.
- Data sources: Hydrological and meteorological observations (e.g., precipitation, temperature, runoff).
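The attention-LSTM baselines listed above all follow the same basic pattern: an LSTM encodes the daily forcing sequence, and an attention layer weights the per-timestep hidden states before prediction. A minimal sketch of that weighting step (scaled dot-product attention over precomputed hidden states; the paper's exact layer configuration is not given in this summary, so dimensions and the query choice here are illustrative assumptions):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(hidden_states, query):
    """Scaled dot-product attention over LSTM hidden states.

    hidden_states: (T, d) array of per-timestep LSTM outputs
    query: (d,) vector (here, the final hidden state is used as the query)
    Returns the attention-weighted context vector and the weights.
    """
    scores = hidden_states @ query / np.sqrt(hidden_states.shape[1])
    weights = softmax(scores)
    context = weights @ hidden_states
    return context, weights

# Illustrative stand-in for LSTM outputs over a 30-day input window.
rng = np.random.default_rng(0)
T, d = 30, 8
H = rng.normal(size=(T, d))
ctx, w = attention_context(H, H[-1])
```

A multi-head variant simply runs several such weightings in parallel with different learned projections; the HIA-LSTM additionally constrains each head physically, as described under Contributions.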
Main Results
- HIA-LSTM significantly outperforms standard LSTM, attention-LSTM, and multi-head attention LSTM.
- Achieved an average Kling-Gupta Efficiency (KGE) of 0.888 and Nash-Sutcliffe Efficiency (NSE) of 0.892 during the test period.
- In the melt-dominated ADR-KK watershed, the KGE dramatically improved from 0.626 (LSTM) to 0.872.
- The model substantially reduces low-flow overestimation and improves high-flow peak simulation.
- HIA-LSTM maintains stable performance across seasonal transitions.
- Attention weight analysis confirms clear functional specialization aligned with physical processes, enhancing interpretability.
Contributions
- Proposes HIA-LSTM, a novel deep learning model that embeds physical inductive biases (quickflow, slowflow, snowmelt dynamics) directly into its neural architecture.
- Introduces specialized attention heads with temporal masking, logarithmic decay, and temperature-based gating to ensure physically-consistent runoff generation.
- Effectively bridges data-driven and process-based modeling, providing a robust and physically-consistent framework for daily runoff simulation in alpine watersheds.
- Enhances the interpretability and reliability of deep learning models for complex hydrological processes, particularly seasonal snowmelt.
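The three head-level constraints named above (temporal masking, logarithmic decay, temperature-based gating) can be sketched as additive biases on attention scores before the softmax. This is a minimal illustration of the idea, not the paper's implementation; the window length, decay form, and melt threshold are assumptions:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax; -inf entries receive exactly zero weight."""
    e = np.exp(x - x.max())
    return e / e.sum()

def quickflow_weights(scores, window=5):
    """Temporal mask: only the most recent `window` days can receive weight,
    mimicking fast storm-runoff response."""
    t = np.arange(len(scores))
    masked = np.where(t >= len(scores) - window, scores, -np.inf)
    return softmax(masked)

def slowflow_weights(scores):
    """Logarithmic decay: older days are penalized only logarithmically,
    preserving the long memory of baseflow."""
    lag = np.arange(len(scores))[::-1]  # days before present; 0 = most recent
    return softmax(scores - np.log1p(lag))

def snowmelt_weights(scores, temperature, t_melt=0.0):
    """Temperature gate: days at or below the melt threshold are excluded,
    so this head can only attend to potential melt days."""
    gate = np.where(temperature > t_melt, 0.0, -np.inf)
    return softmax(scores + gate)

# Illustrative 20-day window with synthetic scores and air temperatures (degC).
rng = np.random.default_rng(1)
T = 20
scores = rng.normal(size=T)
temp = rng.normal(loc=2.0, scale=5.0, size=T)

wq = quickflow_weights(scores)
ws = slowflow_weights(scores)
wm = snowmelt_weights(scores, temp)
```

Because each bias zeroes or down-weights physically implausible timesteps, the resulting attention maps are directly interpretable as quickflow, slowflow, and snowmelt contributions, which is the specialization the attention-weight analysis in the paper confirms.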
Funding
- Not explicitly mentioned in the provided text.
Citation
@article{Ling2026improved,
author = {Ling, Muwu and Guan, Yashuo and Lian, Yanqing and Sun, Xiaonan and Gao, Yongliang and Ren, Yuling},
title = {An improved Hydrology-Informed attention LSTM (HIA-LSTM) model for runoff simulation with seasonal snowmelt},
journal = {Journal of Hydrology},
year = {2026},
doi = {10.1016/j.jhydrol.2026.135231},
url = {https://doi.org/10.1016/j.jhydrol.2026.135231}
}
Original Source: https://doi.org/10.1016/j.jhydrol.2026.135231