GF-UNet: A Cropland Extraction Method Based on Attention Gate and Adaptive Feature Fusion

Chuanhu Li, Yunyan Wang

Abstract


Cropland is a strategic resource for maintaining national food security, and accurately obtaining cropland information is crucial for food production and security. Owing to the complexity of agricultural environments in very high-resolution (VHR) remote sensing images, cropland extraction is often disturbed by the type of cropland, the category of crops, and the surrounding vegetation, which makes cropland difficult to extract and lowers extraction accuracy. To solve this problem, this article proposes a cropland extraction network called GF-UNet. First, building on Attention U-Net, GF-UNet uses attention gates (AGs) to better discriminate between similar cropland and non-cropland features in complex scenes. Second, an adaptive feature fusion module (AFFM) is introduced to integrate multi-scale cropland features, strengthening the network's ability to identify cropland. Finally, a spatial feature extraction module (SFEM) is inserted into the skip connections to improve the extraction of detailed features in the results. GF-2 satellite images of Xuan'en County, Hubei Province, acquired from June to September 2019, served as the research data, and U-Net, PSPNet, DeepLabv3+, and Attention U-Net were selected for comparative experiments with GF-UNet. The results showed that GF-UNet outperformed the other models in accuracy, F1-score, and IoU, reaching 91.25%, 92.41%, and 84.56%, respectively. The influence of the SFEM and AFFM on the experimental results was also explored. Compared with existing methods, GF-UNet is better suited to cropland extraction in complex scenes and provides a practical approach for this task.
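For orientation, the sketch below illustrates the additive attention-gate mechanism that GF-UNet inherits from Attention U-Net (Oktay et al., arXiv:1804.03999); the AFFM and SFEM are described in the full text. This is a minimal PyTorch sketch of the published AG formulation, not the authors' implementation: the channel sizes are placeholders, and the gating signal is assumed to be already upsampled to the resolution of the skip features.

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Additive attention gate (Attention U-Net): skip-connection features
    are re-weighted by coefficients computed from the decoder's gating
    signal, suppressing responses from non-cropland-like regions."""

    def __init__(self, gate_ch: int, skip_ch: int, inter_ch: int):
        super().__init__()
        self.w_g = nn.Conv2d(gate_ch, inter_ch, kernel_size=1)  # project gating signal
        self.w_x = nn.Conv2d(skip_ch, inter_ch, kernel_size=1)  # project skip features
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1)        # collapse to 1-channel map
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, g: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # g: decoder gating signal; x: encoder skip features (same H x W assumed).
        alpha = self.sigmoid(self.psi(self.relu(self.w_g(g) + self.w_x(x))))
        return x * alpha  # per-pixel attention coefficients in [0, 1]

# Toy usage: 64-channel skip features gated by a 128-channel decoder signal.
gate = AttentionGate(gate_ch=128, skip_ch=64, inter_ch=32)
g = torch.randn(1, 128, 64, 64)
x = torch.randn(1, 64, 64, 64)
print(gate(g, x).shape)  # torch.Size([1, 64, 64, 64])
```

In the full network, one such gate sits on each skip connection, so shallow encoder features are filtered by deeper semantic context before being fused in the decoder.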



References


Z. Y. Dong, J. H. Li, J. Zhang, J. Q. Yu, and S. An, "Cultivated land extraction from high-resolution remote sensing images based on BECU-Net model with edge enhancement", National Remote Sensing Bulletin, vol. 27, no. 12, pp. 2847-2859, 2023, doi: 10.11834/jrs.20222268.

W. Liu, Z. F. Wu, J. C. Luo, "A divided and stratified extraction method of high-resolution remote sensing information for cropland in hilly and mountainous areas based on deep learning", Acta Geodaetica et Cartographica Sinica, vol. 50, pp. 105-116, 2021, doi: 10.11947/j.AGCS.2021.20190448.

F. B. Wu, "China Crop Watch System with Remote Sensing", National Remote Sensing Bulletin, vol. 6, pp. 481-497, 2004, doi: 10.11834/jrs.20040601.

S. S. Panda, M. N. Rao, P. Thenkabail, and J. E. Fitzerald, "Remote Sensing Systems—Platforms and Sensors: Aerial, Satellite, UAV, Optical, Radar, and LiDAR", in Remotely Sensed Data Characterization, Classification, and Accuracies, CRC Press, pp. 37-92, 2015, ISBN 978-0-429-08939-8.

N. Zhou, P. Yang, and C. S. Wei, "Accurate extraction method for cropland in mountainous areas based on field parcel", Transactions of the Chinese Society of Agricultural Engineering, vol. 37, pp. 260-266, 2021, doi: 10.11975/j.issn.1002-6819.2021.19.030.

L. Samaniego and K. Schulz, "Supervised Classification of Agricultural Land Cover Using a Modified k-NN Technique (MNN) and Landsat Remote Sensing Imagery", Remote Sens., vol. 1, pp. 875-895, 2009, doi: 10.3390/rs1040875.

F. Waldner, "Automated annual cropland mapping using knowledge-based temporal features", ISPRS J. Photogramm. Remote Sens., vol. 110, pp. 1-13, 2015, doi: 10.1016/j.isprsjprs.2015.09.013.

P. Teluguntla, "A 30-m landsat-derived cropland extent product of Australia and China using random forest machine learning algorithm on Google Earth Engine cloud computing platform", ISPRS J. Photogramm. Remote Sens., vol. 144, pp. 325-340, 2018, doi: 10.1016/j.isprsjprs.2018.07.017.

X. D. Zhang, "An object-based supervised classification framework for very-high-resolution remote sensing images using convolutional neural networks", Remote Sens. Lett., vol. 9, pp. 373-382, 2018, doi: 10.1080/2150704X.2017.1422873.

P. Yuan, K. Wang, and J. Xiao, "High resolution image cropland extraction based on RMAU-Net network model", Hubei Agricultural Sciences, vol. 62, no. 8, pp. 182-188, 2023, doi: 10.14088/j.cnki.issn0439-8114.2023.08.029.

R. S. Khudeyer and N. M. Almoosawi, "Combination of machine learning algorithms and Resnet50 for Arabic Handwritten Classification", Informatica, vol. 46, no. 9, pp. 39-44, 2023, doi: 10.31449/inf.v46i9.4375.

X. C. Zhang, J. F. Huang, and T. Ning, "Progress and prospect of cropland extraction from high-resolution remote sensing images", Geomatics and Information Science of Wuhan University, vol. 48, no. 10, pp. 1582-1590, 2023, doi: 10.13203/j.whugis20230114.

E. M. Aminoff, S. Baror, and E. W. Roginek, "Contextual Associations Represented Both in Neural Networks and Human Behavior", Sci. Rep., vol. 12, Art. no. 5570, 2022, doi: 10.1038/s41598-022-09451-y.

Y. Qing and W. Liu, "Hyperspectral Image Classification Based on Multi-Scale Residual Network with Attention Mechanism", Remote Sens., vol. 13, no. 3, Art. no. 335, 2021, doi: 10.3390/rs13030335.

Z. Liu, N. Li, and L. Wang, "A multi-angle comprehensive solution based on deep learning to extract cropland information from high-resolution remote sensing images", Ecological Indicators, vol. 141, Art. no. 108961, 2022, doi: 10.1016/j.ecolind.2022.108961.

D. J. Zhang, Y. Z. Pan, and J. S. Zhang, "A generalized approach based on convolutional neural networks for large area cropland mapping at very high resolution", Remote Sensing of Environment, vol. 247, Art. no. 111912, 2020, doi: 10.1016/j.rse.2020.111912.

Z. Du, J. Yang, C. Ou, et al., "Smallholder crop area mapped with a semantic segmentation deep learning method", Remote Sensing, vol. 11, no. 7, Art. no. 888, 2019, doi: 10.3390/rs11070888.

O. Ronneberger, P. Fischer, and T. Brox, "U-Net: Convolutional networks for biomedical image segmentation", in 18th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), Munich, Germany, 2015, pp. 234-241, doi: 10.1007/978-3-319-24574-4_28.

J. H. Kim, S. H. Lee, and H. H. Han, "Modified pyramid scene parsing network with deep learning based multi scale attention", Journal of the Korea Convergence Society, vol. 12, no. 11, pp. 45-51, 2021, doi: 10.15207/JKCS.2021.12.11.045.

S. S. Li, Q. Cai, Z. Z. Li, et al., "Attention-aware invertible hashing network with skip connections", Pattern Recognition Letters, vol. 138, pp. 556-562, 2020, doi: 10.1016/j.patrec.2020.09.002.

O. Oktay, J. Schlemper, L. L. Folgoc, et al., "Attention U-Net: Learning where to look for the pancreas", arXiv:1804.03999, 2018, doi: 10.48550/arXiv.1804.03999.

L. Xia, J. Luo, Y. Sun and H. Yang, "Deep Extraction of Cropland Parcels from Very High-Resolution Remotely Sensed Imagery," 2018 7th International Conference on Agro-geoinformatics (Agro-geoinformatics), Hangzhou, China, 2018, pp. 1-5, doi: 10.1109/Agro-Geoinformatics.2018.8476002.

K. W. Wu, S. R. Zhang, and Z. Xie, "Monocular depth prediction with residual DenseASPP network", IEEE Access, vol. 8, pp. 129899-129910, 2020, doi: 10.1109/ACCESS.2020.3006704.

R. R. Liu, F. Tao, X. T. Liu, et al., "RAANet: A residual ASPP with attention framework for semantic segmentation of high-resolution remote sensing images", Remote Sensing, vol. 14, no. 13, Art. no. 3109, 2022, doi: 10.3390/rs14133109.

M. Y. Yu, X. X. Chen, and W. Z. Zhang, "AGs-Unet: Building Extraction Model for High Resolution Remote Sensing Images Based on Attention Gates U Network", Sensors, vol. 22, no. 8, Art. no. 2932, 2022, doi: 10.3390/s22082932.

W. L. Yu, B. Liu, and H. Liu, "Building extraction method of remote sensing image based on Attention Gates and R2U-Net", Geography and Geo-Information Science, vol. 38, no. 3, pp. 31-36+42, 2022, doi: 10.3969/j.issn.1672-0504.2022.03.005.

J. J. Liao, B. Zhu, and Y. L. Chang, "Mangrove change dataset of Hainan Island based on Gaofen-2 data (2015-2019)", China Scientific Data, vol. 7, no. 4, pp. 7-17, 2022, doi: 10.11922/11-6035.noda.2021.0016.zh.

D. P. Roy, M. A. Wulder, and T. R. Loveland, "Landsat-8: Science and product vision for terrestrial global change research", Remote Sensing of Environment, vol. 145, pp. 154-172, 2014, doi: 10.1016/j.rse.2014.02.001.

X. Jin, Y. Xie, X. S. Wei, et al., "Delving deep into spatial pooling for squeeze-and-excitation networks", Pattern Recognition, vol. 121, Art. no. 108159, 2022, doi: 10.1016/j.patcog.2021.108159.

J. Ji, S. T. Li, and J. Xiong, "Semantic Image Segmentation with Propagating Deep Aggregation", IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 12, pp. 9732-9742, 2020, doi: 10.1109/TIM.2020.3004902.

Q. Zhao, J. Liu, and Y. Li, "Semantic segmentation with attention mechanism for remote sensing images", IEEE Transactions on Geoscience and Remote Sensing, vol. 60, pp. 1-13, 2021, doi: 10.1109/TGRS.2021.3085889.

W. Wang, X. Tan, P. Zhang, et al., "A CBAM based multiscale transformer fusion approach for remote sensing image change detection", IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 15, pp. 6817-6825, 2022, doi: 10.1109/JSTARS.2022.3198517.

D. H. Xie, H. Xu, and X. L. Xiong, "Cropland Extraction in Southern China from Very High-Resolution Images Based on Deep Learning", Remote Sensing, vol. 15, no. 9, Art. no. 2231, 2023, doi: 10.3390/rs15092231.

F. Milletari, N. Navab, and S. A. Ahmadi, "V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation", in 2016 Fourth International Conference on 3D Vision (3DV), Stanford, CA, USA, 2016, pp. 565-571, doi: 10.1109/3DV.2016.79.

M. Yeung, E. Sala, and C. B. Schönlieb, "Unified Focal loss: Generalising Dice and cross entropy-based losses to handle class imbalanced medical image segmentation", Computerized Medical Imaging and Graphics, vol. 95, Art. no. 102026, 2022, doi: 10.1016/j.compmedimag.2021.102026.

L. Q. Nan, "High-resolution cropland extraction in Shandong province using MPSPNet and UNet network", National Remote Sensing Bulletin, vol. 27, pp. 471-491, 2023, doi: 10.11834/jrs.20210478.

Z. Shao, "Emerging Issues in Mapping Urban Impervious Surfaces Using High-Resolution Remote Sensing Images", Remote Sensing, vol. 15, no. 10, Art. no. 2562, 2023, doi: 10.3390/rs15102562.




DOI: https://doi.org/10.31449/inf.v48i7.5691

This work is licensed under a Creative Commons Attribution 3.0 License.