Yan et al. (2026) SSF-TransUnet: Fine-Grained Crop Classification via Cross-Source Spatial Spectral Fusion
⚠️ Warning: This summary was generated from the abstract only, as the full text was not available.
Identification
- Journal: Remote Sensing
- Year: 2026
- Date: 2026-03-30
- Authors: Jian Yan, Xueke Chen, Rongrong Ren, Xiaofei Mi, Zhanliang Yuan, Jian Yang, Xianhong Meng, Zhenzhao Jiang, Hongbo Zhu, Yi Liu
- DOI: 10.3390/rs18071034
Research Groups
Not explicitly provided in the given paper text.
Short Summary
This paper proposes SSF-TransUnet, a dual-branch deep learning framework, to address the challenge of fine-grained crop classification by effectively fusing high-spatial-resolution imagery and multi-spectral observations from different satellite sensors. The method achieves an overall accuracy of 81.84% and a mean Intersection over Union of 0.6954, outperforming representative CNN-based and hybrid CNN–Transformer baselines in distinguishing the studied crop categories.
Objective
- To develop a novel deep learning framework, SSF-TransUnet, that explicitly decouples spatial structure extraction and spectral discriminability learning to effectively fuse high spatial resolution imagery and multi-spectral observations from different satellite sensors for fine-grained crop classification.
Study Configuration
- Spatial Scale: Meter-level resolution across five representative agricultural regions in northern China, covering five crop categories (corn, rice, wheat, potato, and others).
- Temporal Scale: Observations acquired from different satellite sensors (GF-2, Sentinel-2) over their respective acquisition periods, with multi-temporal integration noted as a direction for future work.
Methodology and Data
- Models used: SSF-TransUnet (a dual-branch spatial–spectral joint modeling framework), compared against representative CNN-based and hybrid CNN–Transformer models.
- Data sources:
  - GF-2 imagery (meter-level spatial resolution).
  - Sentinel-2 data (multi-spectral observations).
  - SSCR-Agri dataset: a newly constructed spatial–spectral complementary resolution agricultural dataset integrating GF-2 and Sentinel-2 data from five agricultural regions in northern China.
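The abstract does not describe how the two sources are brought onto a common grid inside SSF-TransUnet; the fusion there is learned by the dual-branch network. As a hedged illustration only, the sketch below shows the simplest possible cross-source alignment: nearest-neighbour upsampling of a coarse multi-spectral cube (Sentinel-2-like) to a fine spatial grid (GF-2-like) followed by channel stacking. All shapes, band counts, and the `factor=10` ratio are illustrative assumptions, not values from the paper.

```python
import numpy as np

def upsample_nearest(coarse, factor):
    """Nearest-neighbour upsampling of a (bands, H, W) cube by an integer factor."""
    return coarse.repeat(factor, axis=1).repeat(factor, axis=2)

def stack_sources(fine, coarse, factor):
    """Align a coarse multi-spectral cube to a fine spatial grid and stack channels.

    fine:   (B_f, H, W)            e.g. meter-level GF-2-like bands
    coarse: (B_c, H//f, W//f)      e.g. decameter Sentinel-2-like bands
    """
    up = upsample_nearest(coarse, factor)
    assert up.shape[1:] == fine.shape[1:], "grids must match after upsampling"
    return np.concatenate([fine, up], axis=0)

# toy example: 4 fine bands at 80x80, 10 coarse bands at 8x8 (10x resolution gap)
rng = np.random.default_rng(0)
gf2_like = rng.random((4, 80, 80))
s2_like = rng.random((10, 8, 8))
fused = stack_sources(gf2_like, s2_like, factor=10)
print(fused.shape)  # (14, 80, 80)
```

A learned model such as SSF-TransUnet would replace this fixed stacking with trainable spatial and spectral branches, but the shape bookkeeping is the same.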
Main Results
- SSF-TransUnet consistently outperforms representative CNN-based and hybrid CNN–Transformer models in fine-grained crop classification.
- The proposed method achieved an overall accuracy (OA) of 81.84%.
- The proposed method achieved a mean Intersection over Union (mIoU) of 0.6954.
- The model effectively distinguished five crop categories: corn, rice, wheat, potato, and others.
- The results highlight the effectiveness of spatial–spectral joint modeling for high-resolution crop mapping.
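For readers unfamiliar with the two reported metrics, the following minimal sketch computes overall accuracy (OA) and mean Intersection over Union (mIoU) from a confusion matrix, using the standard definitions. The tiny label arrays are made-up toy data, not results from the paper; the sketch also assumes every class occurs in the ground truth (otherwise the per-class union can be zero).

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows index true labels, columns index predicted labels."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    np.add.at(cm, (y_true, y_pred), 1)
    return cm

def overall_accuracy(cm):
    """Fraction of pixels whose predicted class matches the true class."""
    return np.trace(cm) / cm.sum()

def mean_iou(cm):
    """Mean over classes of intersection / union (TP / (TP + FP + FN))."""
    inter = np.diag(cm).astype(float)
    union = cm.sum(axis=0) + cm.sum(axis=1) - inter
    return float(np.mean(inter / union))

# toy labels over 6 pixels and 3 classes
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])
cm = confusion_matrix(y_true, y_pred, 3)
print(overall_accuracy(cm))  # 4/6 ≈ 0.6667
print(mean_iou(cm))          # (1/3 + 2/3 + 1/2) / 3 = 0.5
```

The paper's OA of 81.84% and mIoU of 0.6954 would be obtained the same way, from the confusion matrix over all evaluated pixels.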
Contributions
- Proposes SSF-TransUnet, a novel dual-branch spatial–spectral joint modeling framework for fine-grained crop classification using multi-source remote sensing data with varying resolutions.
- Explicitly decouples spatial structure extraction and spectral discriminability learning within a unified architecture to address cross-source spatial–spectral fusion challenges.
- Constructs SSCR-Agri, a new spatial–spectral complementary resolution agricultural dataset integrating meter-level GF-2 imagery and multi-spectral Sentinel-2 data.
- Demonstrates the superior performance of SSF-TransUnet over existing CNN-based and hybrid CNN–Transformer models.
- Highlights the potential of spatial–spectral joint modeling for precision agriculture and large-scale agricultural monitoring, and notes the prospective extension to multi-temporal observations.
Funding
Not explicitly provided in the given paper text.
Citation
@article{Yan2026SSFTransUnet,
author = {Yan, Jian and Chen, Xueke and Ren, Rongrong and Mi, Xiaofei and Yuan, Zhanliang and Yang, Jian and Meng, Xianhong and Jiang, Zhenzhao and Zhu, Hongbo and Liu, Yi},
title = {SSF-TransUnet: Fine-Grained Crop Classification via Cross-Source Spatial Spectral Fusion},
journal = {Remote Sensing},
year = {2026},
doi = {10.3390/rs18071034},
url = {https://doi.org/10.3390/rs18071034}
}
Original Source: https://doi.org/10.3390/rs18071034