
CN-121982527-A - Intelligent monitoring and evaluating method and system for growth state of dryopteris crassifolia

CN 121982527 A

Abstract

The invention provides an intelligent monitoring and evaluating method and system for the growth state of the thick-stalked water fern, belonging to the technical field of computer vision. The method comprises inputting an image of the thick-stalked water fern into an improved UNet3+ model. The improved UNet3+ model comprises an encoder, a dual-path context attention module and a decoder. The encoder extracts features from the image to obtain a feature map; the dual-path context attention module generates a global key-value vector and a local key-value vector of the feature map, respectively, and generates a fusion feature map based on the global key-value vector, the local key-value vector and an attention mechanism; the decoder generates an edge segmentation result of the thick-stalked water fern based on the fusion feature map; and the growth stage of the thick-stalked water fern is determined based on the edge segmentation result. By introducing the dual-path context attention module into the UNet3+ model, the invention produces a more robust and accurate edge segmentation result and improves the accuracy with which the growth stage of the thick-stalked water fern is judged.

Inventors

  • YE XI
  • WANG DINGKUN
  • HU HUILI
  • XIANG QING

Assignees

  • Jianghan University (江汉大学)

Dates

Publication Date
2026-05-05
Application Date
2026-01-20

Claims (10)

  1. An intelligent monitoring and evaluating method for the growth state of a thick-stalked water fern, characterized by comprising the following steps: inputting image data of the thick-stalked water fern into an improved UNet3+ model, wherein the improved UNet3+ model comprises an encoder, a dual-path context attention module and a decoder; extracting features from the image data through the encoder to obtain a feature map; generating, through the dual-path context attention module, a global key-value vector representing global context information of the feature map and a local key-value vector representing local detail information of the feature map, respectively, and generating a fusion feature map based on the global key-value vector, the local key-value vector and an attention mechanism; generating an edge segmentation result of the thick-stalked water fern based on the fusion feature map through the decoder; and determining the growth stage of the thick-stalked water fern based on the edge segmentation result.
  2. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 1, wherein the dual-path context attention module comprises an adaptive global pooling module, a local sliding window module and a fusion module, and wherein the generating, through the dual-path context attention module, a global key-value vector representing global context information of the feature map and a local key-value vector representing local detail information of the feature map, respectively, and generating a fusion feature map based on the global key-value vector, the local key-value vector and an attention mechanism comprises: dividing the feature map into a plurality of non-overlapping blocks through the adaptive global pooling module, generating a key matrix and a value matrix corresponding to each block, selecting one position from each block, generating a query vector of the position, performing intra-block attention calculation based on the query vector, the key matrix and the value matrix corresponding to each block, and linearly mapping the calculation result to obtain a global key vector and a global value vector corresponding to each block; generating a plurality of sliding windows through the local sliding window module, each centered on a pixel point of the feature map, and mapping the feature vector of each pixel in the sliding windows into a local key vector and a local value vector; and generating the fusion feature map based on the global key vectors, the global value vectors, the local key vectors and the local value vectors through the fusion module.
  3. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 2, wherein the dividing the feature map into a plurality of non-overlapping blocks through the adaptive global pooling module comprises: dividing the feature map into a plurality of non-overlapping blocks based on a division scale parameter through the adaptive global pooling module, wherein the division scale parameter is dynamically adjusted based on the size of the feature map and the network level of the dual-path context attention module in the improved UNet3+ model.
  4. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 2, wherein the selecting one position from each block and generating a query vector of the position comprises: randomly selecting one position from each block and generating a query vector of the position.
  5. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 2, wherein the generating the fusion feature map based on the global key vectors, the global value vectors, the local key vectors and the local value vectors through the fusion module comprises: determining a key space and a value space of each pixel in the feature map, wherein the key space is formed by concatenating the global key vector corresponding to the pixel with its set of local key vectors, and the value space is formed by concatenating the global value vector corresponding to the pixel with its set of local value vectors; performing attention calculation based on the query vector and the key space of each pixel to obtain a mixed attention weight; and generating the fusion feature map based on the weighted sum of the value space under the mixed attention weight.
  6. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 5, wherein the performing attention calculation based on the query vector and the key space of each pixel to obtain the mixed attention weight comprises: calculating the mixed attention weight by the following expression: w_i = Softmax(q_i (K_i + R_i)^T / √d_k), where q_i represents the query vector mapped from the i-th pixel; K_i represents the key space of the i-th pixel; T represents the transpose; d_k represents the dimension of the key vectors; and R_i represents the relative position offset vectors acting on the local key vectors, each determined by the two-dimensional spatial displacement of the corresponding local key vector relative to the i-th pixel.
  7. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 1, wherein the determining the growth stage of the thick-stalked water fern based on the edge segmentation result comprises: extracting, from the edge segmentation result, growth key indexes representing growth area, health condition and morphological complexity; calculating a growth state index based on the growth key indexes; and determining the growth stage of the thick-stalked water fern based on the growth state index.
  8. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 7, wherein the growth key indexes comprise canopy coverage, canopy diameter, greenness index, fractal dimension, boundary density and shape factor.
  9. The intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern according to claim 7, wherein the calculating the growth state index based on the growth key indexes comprises: analyzing the correlations among the growth key indexes, extracting the principal components affecting the growth state of the thick-stalked water fern by principal component analysis, and determining the weight of each index based on the principal components; and obtaining the growth state index by linearly fusing each growth key index with its corresponding weight.
  10. An intelligent monitoring and evaluating system for the growth state of a thick-stalked water fern, characterized by comprising: an image input module for inputting image data of the thick-stalked water fern into an improved UNet3+ model, wherein the improved UNet3+ model comprises an encoder, a dual-path context attention module and a decoder; a first feature extraction module for performing feature extraction on the image data through the encoder to obtain a feature map; a second feature extraction module for generating, through the dual-path context attention module, a global key-value vector representing global context information of the feature map and a local key-value vector representing local detail information of the feature map, respectively, and generating a fusion feature map based on the global key-value vector, the local key-value vector and an attention mechanism; a segmentation result determining module for generating an edge segmentation result of the thick-stalked water fern based on the fusion feature map through the decoder; and a growth stage determining module for determining the growth stage of the thick-stalked water fern based on the edge segmentation result.
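The per-pixel fusion described in claims 5 and 6 can be sketched as follows. This is a minimal NumPy sketch, not the patent's implementation: the learned query/key/value projections are replaced by identity maps, the relative position offsets are passed in as given arrays, and all shapes are illustrative.

```python
import numpy as np

def mixed_attention(q, k_global, v_global, k_local, v_local, rel_offsets):
    """Per-pixel fusion sketch (claims 5-6): the key space concatenates the
    pixel's global and local keys (relative position offsets are added to the
    local keys only), the value space concatenates the corresponding values,
    and the fused feature is the attention-weighted sum over the value space.

    Shapes: q (d,), k_global/v_global (n, d), k_local/v_local (m, d),
    rel_offsets (m, d)."""
    d = q.shape[0]
    key_space = np.vstack([k_global, k_local + rel_offsets])
    value_space = np.vstack([v_global, v_local])
    logits = key_space @ q / np.sqrt(d)
    logits -= logits.max()                              # numerical stability
    weights = np.exp(logits) / np.exp(logits).sum()     # mixed attention weights
    return weights @ value_space                        # fused feature for this pixel
```

With zero offsets this reduces to ordinary scaled dot-product attention over the concatenated global and local key/value sets.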
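Three of the growth key indexes listed in claim 8 can be computed from a binary segmentation mask along the following lines. The 4-neighbour perimeter approximation and the shape-factor formula 4πA/P² are common image-analysis choices assumed here, not formulas taken from the patent text.

```python
import numpy as np

def growth_indexes(mask):
    """Sketch of three of claim 8's indexes from a binary mask:
    canopy coverage, boundary density, and shape factor (4*pi*A / P^2),
    where the perimeter P counts exposed 4-neighbour edges."""
    mask = mask.astype(bool)
    area = mask.sum()
    coverage = area / mask.size
    p = np.pad(mask, 1)                                  # pad with background
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    perimeter = sum((mask & ~n).sum() for n in (up, down, left, right))
    boundary_density = perimeter / mask.size
    shape_factor = 4 * np.pi * area / perimeter**2 if perimeter else 0.0
    return coverage, boundary_density, shape_factor
```

For a compact, nearly circular canopy the shape factor approaches 1, while a ragged boundary (high fractal dimension) drives it toward 0, which is why claim 8 pairs it with boundary density as a morphological-complexity signal.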
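Claim 9's principal-component weighting can be sketched as below. Using the absolute loadings of the first principal component as the normalized index weights is one plausible reading of "determining the weight of each index based on the principal components"; the patent does not state the exact formula.

```python
import numpy as np

def growth_state_index(samples):
    """samples: (n, k) array, one row of k growth key indexes per observation.
    Standardize the indexes, take the first principal component of their
    covariance, use its absolute loadings as weights (normalized to sum to 1),
    and linearly fuse the standardized indexes into one scalar per sample."""
    z = (samples - samples.mean(axis=0)) / samples.std(axis=0)
    cov = np.cov(z, rowvar=False)
    _, eigvecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    loadings = np.abs(eigvecs[:, -1])          # first principal component
    weights = loadings / loadings.sum()
    return z @ weights, weights
```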

Description

Intelligent monitoring and evaluating method and system for growth state of dryopteris crassifolia

Technical Field

The invention belongs to the technical field of computer vision, and particularly relates to an intelligent monitoring and evaluating method and system for the growth state of a thick-stalked water fern.

Background

Currently, monitoring of the growth state of aquatic plants relies mainly on manual field investigation and traditional image processing methods. Manual monitoring, while intuitive, is time- and labor-consuming, and it is difficult to achieve large-scale, high-frequency data acquisition. Traditional image processing techniques such as threshold segmentation, edge detection and region growing achieve a certain effect in simple image scenes, but still suffer from low segmentation precision and insufficient robustness in complex natural environments. For example, when a conventional UNet3+ model performs the aquatic-plant edge segmentation task, the spatial resolution of the feature maps produced by the encoder gradually decreases and the number of channels gradually increases as the number of encoding layers grows, so the model progressively learns higher-level semantic features. By the deepest layer of the encoder, however, the spatial dimensions of the feature map have shrunk so far that the convolution layers cannot effectively capture long-range dependencies in the image, resulting in insufficient final segmentation accuracy.
Disclosure of the Invention

Aiming at the problems in the prior art, the invention provides an intelligent monitoring and evaluating method for the growth state of the thick-stalked water fern, comprising the following steps: inputting image data of the thick-stalked water fern into an improved UNet3+ model, wherein the improved UNet3+ model comprises an encoder, a dual-path context attention module and a decoder; extracting features from the image data through the encoder to obtain a feature map; generating, through the dual-path context attention module, a global key-value vector representing global context information of the feature map and a local key-value vector representing local detail information of the feature map, respectively, and generating a fusion feature map based on the global key-value vector, the local key-value vector and an attention mechanism; generating an edge segmentation result of the thick-stalked water fern based on the fusion feature map through the decoder; and determining the growth stage of the thick-stalked water fern based on the edge segmentation result.
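The steps above can be sketched end to end as follows. The model callable, the stage names and the coverage thresholds are all illustrative placeholders assumed for the sketch, not values stated in the patent.

```python
import numpy as np

def monitor_growth(image, model, thresholds=(0.3, 0.6)):
    """Pipeline sketch: run the segmentation model (standing in for the
    improved UNet3+ with its encoder, dual-path context attention module
    and decoder), derive one growth key index from the resulting mask,
    and map it to a growth stage. `model` is any callable returning a
    binary mask for the input image."""
    mask = model(image)                  # edge segmentation result
    coverage = float(np.mean(mask))      # simplest growth key index
    if coverage < thresholds[0]:
        return "early"
    if coverage < thresholds[1]:
        return "intermediate"
    return "mature"
```

In the full method the single coverage value would be replaced by the growth state index fused from all of claim 8's indexes.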
In some embodiments, the dual-path context attention module comprises an adaptive global pooling module, a local sliding window module and a fusion module, and the generating, through the dual-path context attention module, a global key-value vector representing global context information of the feature map and a local key-value vector representing local detail information of the feature map, respectively, and generating a fusion feature map based on the global key-value vector, the local key-value vector and an attention mechanism comprises: dividing the feature map into a plurality of non-overlapping blocks through the adaptive global pooling module, generating a key matrix and a value matrix corresponding to each block, selecting one position from each block, generating a query vector of the position, performing intra-block attention calculation based on the query vector, the key matrix and the value matrix corresponding to each block, and linearly mapping the calculation result to obtain a global key vector and a global value vector corresponding to each block; generating a plurality of sliding windows through the local sliding window module, each centered on a pixel point of the feature map, and mapping the feature vector of each pixel in the sliding windows into a local key vector and a local value vector; and generating the fusion feature map based on the global key vectors, the global value vectors, the local key vectors and the local value vectors through the fusion module.
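The two paths described above can be sketched as follows. Identity maps stand in for the learned key/value projections and for the linear mapping of the intra-block attention result, and the block and window sizes are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_path(feat, block, rng):
    """Adaptive-global-pooling path: split the (H, W, C) feature map into
    non-overlapping block x block tiles, sample one query position per tile,
    run intra-block attention, and return one key and one value vector per
    tile (the learned linear map after attention is omitted here)."""
    H, W, C = feat.shape
    keys, values = [], []
    for y in range(0, H, block):
        for x in range(0, W, block):
            tile = feat[y:y + block, x:x + block].reshape(-1, C)
            q = tile[rng.integers(len(tile))]            # sampled query position
            ctx = softmax(tile @ q / np.sqrt(C)) @ tile  # intra-block attention
            keys.append(ctx)
            values.append(ctx)
    return np.array(keys), np.array(values)

def local_path(feat, y, x, win=1):
    """Sliding-window path: keys/values are the feature vectors inside a
    (2*win+1)^2 window centered on pixel (y, x), clipped at the borders."""
    H, W, C = feat.shape
    patch = feat[max(0, y - win):y + win + 1,
                 max(0, x - win):x + win + 1].reshape(-1, C)
    return patch, patch
```

The fusion module would then concatenate each pixel's global and local keys (and values) and run the mixed attention of claims 5 and 6 over them.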
In some implementations, the dividing the feature map into a plurality of non-overlapping blocks through the adaptive global pooling module includes: dividing the feature map into a plurality of non-overlapping blocks based on a division scale parameter through the adaptive global pooling module, wherein the division scale parameter is dynamically adjusted based on the size of the feature map and the network level of the dual-path context attention module in the improved UNet3+ model. In some embodiments, the selecting one position from each block and generating a query vector of the position includes: randomly selecting one position from each block and generating a query vector of the position. In some implementations, the generating, by the fusion module, a fusion feature map base