CN-121982516-A - Multi-modal sensor fusion-based crop phenotype rapid identification system and method

CN121982516A

Abstract

The invention discloses a crop phenotype rapid identification system and method based on multi-modal sensor fusion. The system comprises a mobile carrying platform, a multi-modal data acquisition module, a main control and data processing module, a wireless communication module, a data analysis module and a result output module. By integrating visible-light imaging, multispectral imaging, three-dimensional sensing and other multi-modal sensors, and by constructing a matched data fusion and intelligent analysis system, the invention achieves rapid, accurate and multi-parameter synchronous identification of crop phenotype information. The fusion of multi-source data effectively compensates for the shortcomings of any single sensing modality, markedly improves the comprehensiveness and accuracy of phenotype parameter extraction, and enhances the adaptability of the system in variable environments. The automated, non-contact acquisition and processing workflow greatly improves operating efficiency and reduces labor cost and subjective error, making the system suitable for large-scale, high-throughput phenotype screening in the field.
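As a purely illustrative sketch (not part of the patent), the feature-level fusion the abstract describes can be outlined in Python. All array shapes, band choices and feature definitions below are assumptions; NDVI is the standard normalized difference vegetation index, and the per-modality features are deliberately minimal:

```python
import numpy as np

def visual_features(rgb):
    """Mean color per channel from an H x W x 3 visible-light image."""
    return rgb.reshape(-1, 3).mean(axis=0)

def spectral_features(red, nir):
    """Mean reflectance per band plus mean NDVI = (NIR - R) / (NIR + R)."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    return np.array([red.mean(), nir.mean(), ndvi.mean()])

def morphological_features(points):
    """Plant height and a bounding-box footprint from an N x 3 point cloud."""
    height = points[:, 2].max() - points[:, 2].min()
    footprint = np.ptp(points[:, 0]) * np.ptp(points[:, 1])
    return np.array([height, footprint])

def fuse(rgb, red, nir, points):
    """Feature-level fusion: concatenate per-modality feature vectors
    into one comprehensive feature vector for a registered observation."""
    return np.concatenate([visual_features(rgb),
                           spectral_features(red, nir),
                           morphological_features(points)])

# Toy inputs standing in for one spatially registered plot observation.
rgb = np.full((4, 4, 3), 0.5)
red = np.full((4, 4), 0.2)
nir = np.full((4, 4), 0.8)
points = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 0.9]])
vec = fuse(rgb, red, nir, points)
print(vec.shape)  # (8,)
```

In a real system each helper would be far richer (texture descriptors, per-band curves, voxel volumes), but the fusion step itself is exactly this concatenation into one comprehensive vector that a downstream model consumes.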

Inventors

  • Tao Ting
  • Shao Lingyu
  • Zhang Wei
  • Li Yuru
  • Jiang Xiangtai
  • Yu Jiawen
  • Chang Chuanhao

Assignees

  • 北京国科廪科技有限公司

Dates

Publication Date
2026-05-05
Application Date
2025-12-15

Claims (10)

  1. A crop phenotype rapid identification system based on multi-modal sensor fusion, characterized by comprising a mobile carrying platform, a multi-modal data acquisition module, a main control and data processing module, a wireless communication module, a data analysis module and a result output module; the mobile carrying platform is used for carrying the multi-modal data acquisition module and moving above or beside a crop area to be detected; the multi-modal data acquisition module is fixedly mounted on the mobile carrying platform, is used for synchronously acquiring raw phenotype sensing data of the crop area to be detected, and comprises at least a visible-light imaging unit, a multispectral imaging unit and a three-dimensional sensing unit; the main control and data processing module is arranged on the mobile carrying platform or a remote server and is used for cooperatively controlling the working states of the mobile carrying platform and the multi-modal data acquisition module and for preliminary preprocessing and caching of the acquired raw phenotype sensing data; the wireless communication module is communicatively connected with the main control and data processing module and is used for data interaction and instruction transmission between the modules and between the system and a remote monitoring terminal; the data analysis module is deployed on a local embedded device or a remote cloud server, receives the multi-modal data preprocessed by the main control and data processing module, performs feature extraction and fusion analysis on the multi-modal data, and identifies and calculates one or more target phenotype parameters of the crops based on a preset or trained phenotype identification model; and the result output module is communicatively connected with the data analysis module and is used for visually displaying, storing or transmitting the identified target phenotype parameters to a designated terminal.
  2. The crop phenotype rapid identification system based on multi-modal sensor fusion according to claim 1, wherein the visible-light imaging unit is a high-resolution RGB camera for acquiring morphological structure and color-texture information of the crops; the multispectral imaging unit is a narrow-band multispectral camera covering at least three different central wavelength bands, for acquiring spectral reflectance information of the crops in the visible and near-infrared bands; and the three-dimensional sensing unit is a depth camera or a lidar based on the time-of-flight or structured-light principle, for acquiring three-dimensional point cloud data of the crops to reconstruct their spatial structure.
  3. The crop phenotype rapid identification system based on multi-modal sensor fusion according to claim 2, wherein the preliminary preprocessing of the raw phenotype sensing data by the main control and data processing module comprises at least color correction, denoising and geometric registration of the image data from the visible-light imaging unit and the multispectral imaging unit, and filtering, denoising and coordinate unification of the three-dimensional point cloud data from the three-dimensional sensing unit.
  4. The crop phenotype rapid identification system based on multi-modal sensor fusion according to claim 3, wherein the data analysis module comprises a feature extraction sub-module, a data fusion sub-module and a model recognition sub-module; the feature extraction sub-module is used for extracting visual features, spectral features and morphological structure features related to the target phenotype parameters from the preprocessed visible-light images, multispectral images and three-dimensional point cloud data, respectively; the data fusion sub-module is used for fusing the visual, spectral and morphological structure features at the feature level to generate comprehensive feature vectors; and the model recognition sub-module contains the phenotype identification model, performs inference on the comprehensive feature vectors, and outputs quantified results for the target phenotype parameters.
  5. The crop phenotype rapid identification system based on multi-modal sensor fusion according to claim 4, wherein the phenotype identification model is a machine learning or deep learning model trained on a set of multi-modal training data labeled with the corresponding ground-truth phenotype parameter values, and the target phenotype parameters comprise at least one or more of plant height, leaf area index, canopy coverage, estimated biomass and stress state index.
  6. The crop phenotype rapid identification system based on multi-modal sensor fusion according to claim 5, wherein the mobile carrying platform is a rail platform movable along a preset track, an autonomously navigating mobile robot, or an unmanned aerial vehicle.
  7. The crop phenotype rapid identification system based on multi-modal sensor fusion according to claim 1, further comprising an environmental information acquisition module mounted on the mobile carrying platform for synchronously acquiring environmental parameters of the crop area to be detected, the environmental parameters comprising at least illumination intensity, ambient temperature and air humidity; the main control and data processing module stores the environmental parameters in association with the raw phenotype sensing data and transmits them to the data analysis module, where they serve as reference or compensation factors in the phenotype identification analysis.
  8. The crop phenotype rapid identification system according to claim 7, wherein the visual presentation provided by the result output module comprises at least a numerical list, a two-dimensional distribution map, a three-dimensional heat map, or an analysis report containing the target phenotype parameters.
  9. A crop phenotype rapid identification method based on multi-modal sensor fusion, characterized by comprising the following steps: Step one, system initialization and path planning: starting the mobile carrying platform, the multi-modal data acquisition module and the main control and data processing module, and planning a scanning path covering the area for the mobile carrying platform according to boundary information of the crop area to be tested; Step two, synchronous multi-modal data acquisition: controlling the mobile carrying platform to move along the planned scanning path, synchronously triggering the visible-light imaging unit, the multispectral imaging unit and the three-dimensional sensing unit to scan the crops passed at a preset sampling frequency, and synchronously acquiring a visible-light image sequence, a multispectral image sequence and a three-dimensional point cloud data sequence, which together form the raw phenotype sensing data; Step three, preprocessing and space-time alignment of multi-source data: preprocessing the acquired raw phenotype sensing data in the main control and data processing module, including removing noise from and correcting distortion of the image data, filtering outliers from the point cloud data, and performing timestamp alignment and spatial coordinate registration on data from different sensors that represent the same spatial position or the same crop, to form a space-time aligned multi-modal dataset; Step four, multi-dimensional feature extraction: extracting features of different dimensions from the space-time aligned multi-modal dataset in the data analysis module, namely visual features of color, texture and contour form from the visible-light images, spectral reflectance, vegetation indices and spectral curve features in different bands from the multispectral images, and morphological structure features of height, volume, surface area and three-dimensional shape from the three-dimensional point cloud data; Step five, feature fusion and phenotype parameter identification: fusing the extracted visual, spectral and morphological structure features to generate a comprehensive feature description that comprehensively characterizes the crop state, inputting the comprehensive feature description into the pre-trained phenotype identification model, and, through model computation and inference, identifying and outputting quantitative estimates of one or more target phenotype parameters; Step six, result integration and output: integrating the quantitative estimates of the target phenotype parameters in a preset format in the result output module, and, according to user requirements, locally displaying, storing or transmitting the integrated results to a remote terminal in the form of visual charts, data files or analysis reports.
  10. The crop phenotype rapid identification method based on multi-modal sensor fusion according to claim 9, further comprising synchronously acquiring environmental parameters through an environmental information acquisition module in step two; the environmental parameters are associated and synchronized with the raw phenotype sensing data in step three, and the associated environmental parameters serve in step five as an auxiliary input to the phenotype identification model for environmental-factor correction of the phenotype identification results.
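The timestamp-alignment part of the space-time alignment described in the claims can be illustrated with a small, hypothetical sketch (not from the patent): each sensor produces timestamped frames, and frames from different sensors are paired by nearest timestamp within a tolerance. The sensor names and the 50 ms tolerance are assumptions:

```python
import bisect

def align_streams(ref, other, tol=0.05):
    """Pair each reference-sensor timestamp with the nearest timestamp
    from another sensor, keeping only pairs within `tol` seconds.
    Both inputs are sorted lists of timestamps in seconds."""
    pairs = []
    for t in ref:
        i = bisect.bisect_left(other, t)
        # Candidates: the neighbors on either side of the insertion point.
        candidates = [c for c in (i - 1, i) if 0 <= c < len(other)]
        if not candidates:
            continue
        j = min(candidates, key=lambda c: abs(other[c] - t))
        if abs(other[j] - t) <= tol:
            pairs.append((t, other[j]))
    return pairs

rgb_ts = [0.00, 0.10, 0.20, 0.30]    # visible-light frame times
lidar_ts = [0.01, 0.12, 0.29, 0.55]  # point-cloud sweep times
print(align_streams(rgb_ts, lidar_ts))
# [(0.0, 0.01), (0.1, 0.12), (0.3, 0.29)]
```

Note that the frame at 0.20 s is dropped because no lidar sweep falls within the tolerance; in a full system the matched pairs would then go through spatial coordinate registration before feature extraction.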

Description

Multi-modal sensor fusion-based crop phenotype rapid identification system and method

Technical Field

The invention relates to the technical field of crop identification, and in particular to a system and method for rapidly identifying crop phenotypes based on multi-modal sensor fusion.

Background

In precision agriculture and crop breeding, it is important to acquire crop phenotype information (such as plant height, leaf area, biomass and stress state) quickly and accurately. Traditional crop phenotype identification relies mainly on manual field investigation and the measurement of single physiological indices, and suffers from low efficiency, strong subjectivity, a single data dimension, and a tendency to damage the plants. Although the prior art includes automated devices that perform image recognition with a single type of sensor (such as a visible-light camera), such devices cannot comprehensively and synchronously capture the multi-dimensional morphological, physiological and biochemical information that reflects crop growth; their recognition accuracy and robustness are limited, and in complex field environments they are easily disturbed by factors such as illumination changes and occlusion. There is therefore a need for a crop phenotype identification technology and system that integrates information efficiently, non-destructively and multi-dimensionally, overcomes the limitations of the prior art, and meets the requirements of modern agriculture for high-throughput phenotypic analysis.

Disclosure of Invention

The invention therefore provides a system and method for rapidly identifying crop phenotypes based on multi-modal sensor fusion, so as to solve the problem that the prior art is easily disturbed by factors such as illumination changes and occlusion.
In order to achieve the above object, the invention provides the following technical solution: a crop phenotype rapid identification system based on multi-modal sensor fusion comprises a mobile carrying platform, a multi-modal data acquisition module, a main control and data processing module, a wireless communication module, a data analysis module and a result output module; the mobile carrying platform is used for carrying the multi-modal data acquisition module and moving above or beside a crop area to be detected; the multi-modal data acquisition module is fixedly mounted on the mobile carrying platform, is used for synchronously acquiring raw phenotype sensing data of the crop area to be detected, and comprises at least a visible-light imaging unit, a multispectral imaging unit and a three-dimensional sensing unit; the main control and data processing module is arranged on the mobile carrying platform or a remote server and is used for cooperatively controlling the working states of the mobile carrying platform and the multi-modal data acquisition module and for preliminary preprocessing and caching of the acquired raw phenotype sensing data; the wireless communication module is communicatively connected with the main control and data processing module and is used for data interaction and instruction transmission between the modules and between the system and a remote monitoring terminal; the data analysis module is deployed on a local embedded device or a remote cloud server, receives the multi-modal data preprocessed by the main control and data processing module, performs feature extraction and fusion analysis on the multi-modal data, and identifies and calculates one or more target phenotype parameters of the crops based on a preset or trained phenotype identification model; and the result output module is communicatively connected with the data analysis module and is used for visually displaying, storing or transmitting the identified target phenotype parameters to a designated terminal. Preferably, the visible-light imaging unit is a high-resolution RGB camera for acquiring morphological structure and color-texture information of the crops; the multispectral imaging unit is a narrow-band multispectral camera covering at least three different central wavelength bands, for acquiring spectral reflectance information of the crops in the visible and near-infrared bands; and the three-dimensional sensing unit is a depth camera or a lidar based on the time-of-flight or structured-light principle, for acquiring three-dimensional point cloud data of the crops to reconstruct their spatial structure. Preferably, the preliminary preprocessing of the raw phenotype sensing data by the main control and data processing module comprises at least the steps of carrying out color correction, denoising and geometric registration on the image data from the visible-light imaging unit and the multispectral imaging unit, and carrying