Gupta et al. (2025) Finetuning AI Foundation Models to Develop Subgrid‐Scale Parameterizations: A Case Study on Atmospheric Gravity Waves
⚠️ Warning: This summary was generated from the abstract only, as the full text was not available.
Identification
- Journal: Journal of Advances in Modeling Earth Systems
- Year: 2025
- Date: 2025-11-01
- Authors: Aman Gupta, Aditi Sheshadri, Sujit Roy, Johannes Schmude, Vishal Gaur, Wei Ji Leong, Manil Maskey, Rahul Ramachandran
- DOI: 10.1029/2025ms005075
Research Groups
NASA, IBM Research
Short Summary
This study develops a machine learning parameterization for a small-scale climate process by fine-tuning a pre-trained AI foundation model, and shows that the fine-tuned model outperforms a task-specific baseline in capturing atmospheric gravity wave effects for coarse-resolution climate models.
Objective
- To develop a deep learning parameterization for atmospheric gravity waves (GWs) by fine-tuning a pre-trained AI foundation model (Prithvi WxC) and to evaluate its predictive performance for coarse-resolution climate models compared to a machine learning baseline.
Study Configuration
- Spatial Scale: Global atmosphere; the parameterization targets coarse-resolution climate models and is trained on an atmospheric reanalysis roughly 10 times finer in resolution.
- Temporal Scale: Monthly averages and instantaneous evolution.
Methodology and Data
- Models used: The pre-trained encoder-decoder of Prithvi WxC, a 2.3-billion-parameter AI foundation model from NASA and IBM Research, fine-tuned into a deep learning parameterization for atmospheric gravity waves; an Attention U-Net serves as the machine learning baseline.
- Data sources: Atmospheric reanalysis data (10 times finer resolution than the target coarse-resolution climate model) used to learn gravity wave fluxes.
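The training setup above, learning subgrid gravity wave fluxes from a reanalysis about 10 times finer than the target model grid, can be sketched with a simple block-averaging example. This is an illustration only: the field, grid sizes, and factor-of-10 block mean are assumptions, not the paper's actual regridding or flux definition.

```python
import numpy as np

def coarse_grain(field, factor=10):
    """Block-average a fine-resolution 2D field onto a grid `factor` times coarser."""
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

# Hypothetical fine-resolution flux-like field on a 180 x 360 grid.
rng = np.random.default_rng(0)
fine = rng.standard_normal((180, 360))

coarse = coarse_grain(fine)  # 18 x 36 coarse-grid field
# Subgrid residual: variability the fine grid resolves but the coarse grid
# cannot -- the kind of quantity a gravity wave parameterization must supply.
residual = fine - np.repeat(np.repeat(coarse, 10, axis=0), 10, axis=1)
```

By construction the residual averages to zero within each coarse cell, so everything the parameterization learns is genuinely subgrid-scale.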
Main Results
- The fine-tuned foundation model parameterization effectively captures gravity wave effects for a coarse-resolution climate model.
- It exhibits superior predictive performance throughout the atmosphere, including regions excluded during pre-training, compared to the Attention U-Net baseline.
- The improvement is quantified by a Hellinger distance of 0.06 for the fine-tuned model, versus 0.11 for the Attention U-Net baseline.
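The Hellinger distance cited above measures how far a model's predicted distribution is from a reference distribution (0 = identical, 1 = disjoint support). A minimal sketch of the discrete form, assuming nothing about the specific distributions the paper compares:

```python
import math

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions:
    H(P, Q) = (1 / sqrt(2)) * sqrt(sum_i (sqrt(p_i) - sqrt(q_i))**2).
    """
    assert len(p) == len(q)
    return math.sqrt(sum((math.sqrt(pi) - math.sqrt(qi)) ** 2
                         for pi, qi in zip(p, q))) / math.sqrt(2)

print(hellinger([0.5, 0.5], [0.5, 0.5]))  # identical distributions -> 0.0
print(hellinger([1.0, 0.0], [0.0, 1.0]))  # disjoint distributions -> 1.0
```

On this scale, the drop from 0.11 to 0.06 means the fine-tuned model's predicted flux distribution sits roughly twice as close to the reference as the baseline's.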
Contributions
- Introduces a novel methodology for developing machine learning parameterizations of subgrid-scale climate processes by fine-tuning pre-trained AI foundation models.
- Demonstrates the versatility and reusability of foundation models for climate research applications, specifically for atmospheric gravity wave parameterization.
- Highlights the potential for creating observations-driven and physically accurate parameterizations for a broader range of Earth system processes using this approach.
Funding
Not specified in the provided text.
Citation
@article{Gupta2025Finetuning,
author = {Gupta, Aman and Sheshadri, Aditi and Roy, Sujit and Schmude, Johannes and Gaur, Vishal and Leong, Wei Ji and Maskey, Manil and Ramachandran, Rahul},
title = {Finetuning AI Foundation Models to Develop Subgrid‐Scale Parameterizations: A Case Study on Atmospheric Gravity Waves},
journal = {Journal of Advances in Modeling Earth Systems},
year = {2025},
doi = {10.1029/2025ms005075},
url = {https://doi.org/10.1029/2025ms005075}
}
Original Source: https://doi.org/10.1029/2025ms005075