Yang et al. (2026) Enhancing Multi-Step Ahead Daily Runoff Prediction via HydMoE Model with Local-Global Hybrid Attention
Identification
- Journal: Water Resources Management
- Year: 2026
- Date: 2026-02-23
- Authors: Peilin Yang, Daoyi Chen
- DOI: 10.1007/s11269-026-04502-9
Research Groups
- Tsinghua Shenzhen International Graduate School, Tsinghua University, Shenzhen, Guangdong Province, China
Short Summary
This study proposes HydMoE, a deep learning model that integrates a Mixture of Experts (MoE) architecture with Time2Vec temporal embedding and Local-Global Hybrid Attention to improve multi-step ahead daily runoff prediction while providing interpretability for diverse hydrological patterns. HydMoE outperforms baseline models at 1- to 7-day lead times on the CAMELS dataset.
Objective
- To develop a deep learning model (HydMoE) that improves the accuracy and interpretability of multi-step ahead daily runoff prediction by effectively representing complex, scenario-dependent hydrological processes, addressing limitations of existing models in interpretability, process representation, and reliable multi-step forecasting.
Study Configuration
- Spatial Scale: 507 river basins across the contiguous United States, selected from the CAMELS dataset.
- Temporal Scale: Daily runoff prediction with lead times from 1 to 7 days, using data from October 1, 1980, to December 31, 2014.
Methodology and Data
- Models used: HydMoE (a deep learning framework integrating a Mixture of Experts (MoE) architecture, Time2Vec temporal embedding, and Local-Global Hybrid Attention (LGHA)). Baselines for comparison included Attention+LSTM and TCN+Transformer.
- Data sources: CAMELS dataset (Catchment Attributes and Meteorology for Large-sample Studies), comprising daily meteorological forcings, catchment attributes, and observed runoff records. Observed runoff was normalized to daily runoff depth (millimeters per day).
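The Time2Vec embedding named above is a standard learnable time representation: one linear component plus sinusoidal components that capture periodicity (e.g., seasonality in daily forcings). A minimal NumPy sketch follows; the component count `k` and the randomly initialized `omega`/`phi` are illustrative stand-ins for parameters that would be learned during training.

```python
import numpy as np

def time2vec(tau, omega, phi):
    """Standard Time2Vec embedding: component 0 is linear in time,
    components 1..k are sinusoidal with learned frequency and phase.
    tau: scalar time index; omega, phi: parameter vectors of shape (k+1,)."""
    out = omega * tau + phi        # affine transform of the time index
    out[1:] = np.sin(out[1:])      # periodic components capture seasonality
    return out

rng = np.random.default_rng(0)
k = 7                              # number of periodic components (illustrative)
omega = rng.normal(size=k + 1)
phi = rng.normal(size=k + 1)
emb = time2vec(120.0, omega, phi)  # embed day index 120 into an 8-d vector
```

In a model such as HydMoE, this embedding would be concatenated with (or added to) the meteorological input features at each time step before the attention layers.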
Main Results
- HydMoE consistently demonstrated superior performance over baseline models (Attention+LSTM, TCN+Transformer) across all forecast horizons (1 to 7 days).
- For 1-day ahead prediction, HydMoE achieved a mean Nash-Sutcliffe Efficiency (NSE) of 0.762, outperforming Attention+LSTM (0.721) and TCN+Transformer (0.711).
- For 1-day ahead prediction, HydMoE reduced the basin-mean Mean Absolute Error (MAE) by approximately 6% to 7% and the basin-mean Mean Squared Error (MSE) by about 15% to 35% relative to the baselines.
- For lead times of 2 to 7 days, HydMoE achieved an average NSE of 0.393, surpassing Attention+LSTM (0.356) and TCN+Transformer (0.362).
- At the 7-day lead time, HydMoE maintained an NSE of 0.223, clearly exceeding the baselines' 0.155 and 0.186.
- The MoE architecture provided interpretability by showing expert activation patterns correlated with hydrological scenarios: Experts 3, 4, and 9 for baseflow/low-precipitation; Experts 8 and 11 for moderate precipitation; and Expert 15 for extreme precipitation/flood responses.
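The NSE values reported above follow the standard Nash-Sutcliffe definition, NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))², where 1 is a perfect fit and 0 matches a mean-of-observations predictor. A minimal sketch:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1.0 is a perfect fit, 0.0 matches the
    observation mean, and negative values are worse than the mean."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.0, 3.0, 4.0])
perfect = nse(obs, obs)                        # -> 1.0
baseline = nse(obs, np.full(4, obs.mean()))    # mean predictor -> 0.0
```

The "mean NSE" figures in the paper aggregate this per-basin score across the 507 CAMELS basins.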
Contributions
- Proposes HydMoE, a novel deep learning framework that integrates a Mixture of Experts architecture, Time2Vec temporal embedding, and Local-Global Hybrid Attention for multi-step ahead daily runoff prediction.
- Achieves superior predictive accuracy and robustness over established deep learning baselines across lead times of 1 to 7 days.
- Enhances model interpretability by demonstrating a clear correspondence between expert activation behaviors within the MoE framework and distinct hydrological scenarios, offering insights into dominant runoff mechanisms.
- Provides a robust, data-driven solution for handling watershed heterogeneity, serving as a physically informed alternative to traditional region-specific calibration in large-scale hydrological modeling.
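The expert-activation interpretability described above rests on a gating network that assigns each input a sparse weight distribution over experts; the nonzero entries identify which experts "activate" for a given hydrological scenario. The paper's exact gating mechanism is not detailed in this summary, so the following is a generic softmax top-k gating sketch; the expert count of 16, feature size of 8, and `top_k=2` are illustrative assumptions.

```python
import numpy as np

def topk_gate(x, W_g, top_k=2):
    """Generic sparse MoE gate: softmax over expert logits, keep the
    top_k experts, renormalize their weights. Nonzero entries mark the
    'activated' experts whose patterns can be inspected per input."""
    logits = x @ W_g                           # (n_experts,) gating logits
    probs = np.exp(logits - logits.max())      # numerically stable softmax
    probs /= probs.sum()
    keep = np.argsort(probs)[-top_k:]          # indices of the top_k experts
    gate = np.zeros_like(probs)
    gate[keep] = probs[keep] / probs[keep].sum()
    return gate

rng = np.random.default_rng(1)
n_features, n_experts = 8, 16                  # illustrative sizes
W_g = rng.normal(size=(n_features, n_experts))
g = topk_gate(rng.normal(size=n_features), W_g)
# g has exactly top_k nonzero weights, summing to 1
```

Averaging such gate vectors over inputs grouped by scenario (baseflow, moderate precipitation, flood) is one way to obtain activation patterns like those reported for Experts 3/4/9, 8/11, and 15.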
Funding
- Shenzhen Science and Technology Innovation Committee (Grant No. KCXFZ20240903093900002)
Citation
@article{Yang2026Enhancing,
author = {Yang, Peilin and Chen, Daoyi},
title = {Enhancing Multi-Step Ahead Daily Runoff Prediction via HydMoE Model with Local-Global Hybrid Attention},
journal = {Water Resources Management},
year = {2026},
doi = {10.1007/s11269-026-04502-9},
url = {https://doi.org/10.1007/s11269-026-04502-9}
}
Original Source: https://doi.org/10.1007/s11269-026-04502-9