• Tillage & Cultivation · Physiology & Biochemistry •

### Wheat yield estimation from a UAV platform based on multi-modal remote sensing data fusion

1. 1 Agronomy College, Henan Agricultural University / State Key Laboratory of Wheat and Maize Crop Science, Zhengzhou 450046, Henan, China
2 International Maize and Wheat Improvement Center (CIMMYT), Texcoco, Mexico
• Received: 2021-06-02 Accepted: 2021-10-19 Published: 2022-07-12 Published online: 2021-11-26
• Corresponding author: FENG Wei
• E-mail: 15939266989@163.com
• Supported by:
National "13th Five-Year" Key Research and Development Program of China, Food Crop High Yield and Efficiency Science and Technology Innovation Project (2018YFD0300701); Special Fund of the Ministry of Finance and the Ministry of Agriculture and Rural Affairs for the Construction of the National Modern Agricultural Industry Technology System (CARS-03); Key Science and Technology Program of Henan Province (212102110041)

### Wheat yield estimation from UAV platform based on multi-modal remote sensing data fusion

ZHANG Shao-Hua1, DUAN Jian-Zhao1,2, HE Li1, JING Yu-Hang1, Urs Christoph Schulthess2,*, Azam Lashkari1,2, GUO Tian-Cai1, WANG Yong-Hua1, FENG Wei1,*

1. 1 Agronomy College, Henan Agricultural University / State Key Laboratory of Wheat and Maize Crop Science, Zhengzhou 450002, Henan, China
2 International Maize and Wheat Improvement Center (CIMMYT), Texcoco, Mexico
• Received:2021-06-02 Accepted:2021-10-19 Published:2022-07-12 Published online:2021-11-26
• Contact: Urs Christoph Schulthess,FENG Wei
• Supported by:
National “13th Five-Year” Key Research and Development Program of China(2018YFD0300701);China Agriculture Research System of MOF and MARA(CARS-03);Key Technologies Research & Development Program of Henan Province, China(212102110041)

Abstract:

Crop yield estimates are important for national food security, people, and the environment. Timely and accurate estimation of crop yield at the field scale is of great significance for crop management, harvesting, and trade, and ultimately enables farmers to optimize inputs and economic returns. We selected an irrigated wheat field near Kaifeng, Henan province, for this study. The terrain in that region is undulating, with marked spatial differences. We used a low-altitude unmanned aerial vehicle (UAV) remote sensing platform equipped with a multi-spectral camera, a thermal infrared camera, and an RGB camera to simultaneously acquire different remote sensing parameters during the key growth stages of wheat. Based on the extracted spectral reflectance, thermal infrared temperature, and digital elevation information, we quantified the spatial variability of the remote sensing parameters and of growth indices under different terrain characteristics. We also analyzed the correlations between vegetation indices, temperature parameters, and wheat yield. Using four machine learning methods, namely multiple linear regression (MLR), partial least squares regression (PLSR), support vector machine regression (SVR), and random forest regression (RFR), we compared the yield estimation capability of single-modal data versus multimodal data fusion frameworks. The results showed that slope was an important factor affecting crop growth and yield: remote sensing parameters differed significantly among slope grades, and soil water content, plant water content, and above-ground biomass at the three growth stages were significantly correlated with slope. Most of the vegetation indices and temperature parameters at the three growth stages were significantly correlated with yield as well.
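The vegetation indices and temperature parameters discussed here have standard formulations in the remote sensing literature. As a minimal sketch of the feature extraction step, assuming the conventional definitions of NDVI, GNDVI, NDRE, WDRVI, NRCT, and CTD (the paper's exact band assignments and the WDRVI weighting coefficient may differ):

```python
import numpy as np

def vegetation_indices(green, red, red_edge, nir):
    """Standard formulations of a few of the indices named in the text.

    Inputs are per-pixel (or per-plot) surface reflectance arrays from
    the multi-spectral camera; band definitions are assumptions here.
    """
    ndvi = (nir - red) / (nir + red)
    gndvi = (nir - green) / (nir + green)
    ndre = (nir - red_edge) / (nir + red_edge)
    # WDRVI down-weights NIR; a = 0.12 is a commonly used value (assumed).
    wdrvi = (0.12 * nir - red) / (0.12 * nir + red)
    return {"NDVI": ndvi, "GNDVI": gndvi, "NDRE": ndre, "WDRVI": wdrvi}

def temperature_parameters(canopy_t, air_t):
    """NRCT scales canopy temperature to [0, 1] over the scene;
    CTD is the canopy-air temperature difference."""
    nrct = (canopy_t - canopy_t.min()) / (canopy_t.max() - canopy_t.min())
    ctd = air_t - canopy_t
    return {"NRCT": nrct, "CTD": ctd}
```

Each index is computed per pixel from the thermal and multi-spectral mosaics and then aggregated (e.g., averaged) over each plot before being correlated with yield.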
Based on the strength of their correlation with yield, seven vegetation indices (NDVI, GNDVI, EVI2, OSAVI, SAVI, NDRE, and WDRVI) and two temperature parameters (NRCT and CTD) were selected as the final input variables for the models. For the single-modal framework, models built with the vegetation indices outperformed those built with the temperature parameters, and the highest accuracy was obtained with an RFR model based on vegetation indices at the filling stage (R2 = 0.724, RMSE = 614.72 kg hm-2, MAE = 478.08 kg hm-2). For the dual-modal fusion approach, the highest accuracy was reached at the flowering stage by an RFR model combining the temperature parameters with the vegetation indices (R2 = 0.865, RMSE = 440.73 kg hm-2, MAE = 374.86 kg hm-2). Even higher accuracy was obtained with the multimodal fusion approach, using an RFR model based on vegetation indices, temperature parameters, and slope information at the flowering stage (R2 = 0.893, RMSE = 420.06 kg hm-2, MAE = 352.69 kg hm-2); this flowering-stage fusion model also achieved the highest validation accuracy (R2 = 0.892, RMSE = 423.55 kg hm-2, MAE = 334.43 kg hm-2). These results show that a multimodal data fusion framework combining terrain factors with RFR can fully exploit the complementary and synergistic roles of different remote sensing information sources. It effectively improves the accuracy and stability of the yield estimation model, and provides a reference and support for crop growth monitoring and yield estimation.
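The multimodal fusion step described above can be sketched with scikit-learn. In this illustrative example the feature columns follow the abstract (seven vegetation indices, two temperature parameters, and slope), but the data, train/test split, and hyperparameters are synthetic assumptions, not the paper's:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Hypothetical plot-level feature table: columns 0-6 are the vegetation
# indices, 7-8 the temperature parameters, 9 the slope (all synthetic).
rng = np.random.default_rng(0)
n_plots = 200
X = rng.random((n_plots, 10))
# Synthetic yield in kg hm-2, driven mainly by one index and slope.
y = 6000 + 3000 * X[:, 0] - 1500 * X[:, 9] + rng.normal(0, 200, n_plots)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)

r2 = r2_score(y_te, pred)
rmse = np.sqrt(mean_squared_error(y_te, pred))
mae = mean_absolute_error(y_te, pred)
print(f"R2 = {r2:.3f}, RMSE = {rmse:.1f} kg hm-2, MAE = {mae:.1f} kg hm-2")
```

Swapping `RandomForestRegressor` for `LinearRegression`, `PLSRegression`, or `SVR` on the same feature table reproduces the single-model comparison, and restricting the columns to one modality gives the single-modal baselines.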