CN-122023860-A - Crop image completion and enhancement generation method based on adversarial search
Abstract
The invention discloses a crop image completion and enhancement generation method based on adversarial search, and belongs to the field of intelligent crop pest and disease monitoring and image generation. The method first obtains a crop image dataset and applies noise augmentation and normalization preprocessing. It then constructs an attention-based generator search space and discriminator search space: the generator search space comprises transposed convolution, nearest-neighbor interpolation, and bilinear interpolation operations, while the discriminator search space comprises max pooling and average pooling operations; a multi-scale feature re-weighting module and an ECA attention mechanism are introduced to realize multi-scale feature fusion and inter-channel interaction. Next, an adversarial search strategy is adopted to co-train the generator and the discriminator, search cost is controlled through a partial-connection mechanism, and the network structure is dynamically optimized by combining a focal loss function with the Adam optimization algorithm. Finally, the optimal generator and discriminator are selected based on structural similarity, and the attention weights are normalized and output. Through the co-evolution of the generator and the discriminator, the method realizes high-quality completion and enhancement of low-resolution crop images, balances global and local feature expression, and significantly improves the stability and accuracy of pest and disease detection models as well as their adaptability to complex farmland environments.
Inventors
- LIU CHAO
- MU TIANHONG
- LIU ZHOUZHOU
- LI GUANLIN
- WANG YANYU
- LI HUI
Assignees
- 刘超 (LIU CHAO)
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2025-11-08
Claims (10)
- 1. A crop image completion and enhancement generation method based on adversarial search, characterized by comprising the following steps: acquiring a crop image dataset, and applying random noise augmentation and normalization to the dataset; constructing a generator search space and a discriminator search space containing an attention mechanism; co-training the generator and the discriminator by adopting an adversarial search strategy; and selecting the optimal generator and discriminator structures based on a performance evaluation strategy, and outputting the normalized attention weights as the final model parameters.
- 2. The method of claim 1, wherein the generator search space comprises three upsampling operations, namely transposed convolution, nearest-neighbor interpolation, and bilinear interpolation, which act only on the input node edges of the cell.
- 3. The method of claim 1, wherein the discriminator search space comprises max pooling and average pooling operations, which act only on the output node edges to enhance the multi-scale discrimination capability of the discriminator.
- 4. The method of claim 1, wherein the generator and the discriminator each introduce an ECA attention mechanism module to realize local feature interaction and information weighting among channels.
- 5. The method of claim 1, further comprising constructing a multi-scale feature re-weighting module to perform weighted fusion of the feature maps produced by convolution kernels of different sizes, wherein the fusion weights lie in the range 0.25-0.45.
- 6. The method of claim 1, wherein the adversarial search strategy comprises the steps of: initializing the network weights of the generator and the discriminator; sampling node inputs proportionally via a partial-connection mechanism, with a sampling proportion of 0.5-0.6; executing the candidate operations and concatenating the output node features; computing a focal loss function that assigns higher weight to hard-to-classify samples; and updating the network parameters with an Adam optimizer while dynamically monitoring model performance.
- 7. The method of claim 6, wherein the focal loss weight coefficient is set to 1.2-1.5, the learning rate ranges from 0.0002 to 0.001, and the Adam optimizer parameters are β₁ = 0.8-0.9 and β₂ = 0.999.
- 8. The method of claim 1, wherein the structural similarity between the generator and the discriminator is computed using cosine similarity or Euclidean distance, with a similarity threshold of 0.85-0.9 used to select the best-matched structural combination of models.
- 9. The method of claim 1, wherein the generator and the discriminator each comprise 5-6 residual blocks, each residual block comprising two convolutions with a kernel size of 3 × 3 and a stride of 1.
- 10. The method of claim 1, wherein the method is applied to crop pest and disease image completion and enhancement scenarios, and can recover detailed information of crop leaves under insufficient illumination or low resolution, improving the accuracy and stability of pest and disease detection models.
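As a concrete illustration of the preprocessing step in claim 1, the sketch below adds random noise and then min-max normalizes a toy 1-D "image". The noise standard deviation and the choice of min-max scaling are assumptions for illustration; the claims do not fix either.

```python
import random

def preprocess(pixels, noise_std=5.0, seed=0):
    """Add Gaussian noise for augmentation, then min-max normalize to [0, 1].
    noise_std and the min-max scheme are illustrative assumptions."""
    rng = random.Random(seed)
    noisy = [p + rng.gauss(0.0, noise_std) for p in pixels]
    lo, hi = min(noisy), max(noisy)
    scale = (hi - lo) or 1.0  # guard against constant images
    return [(p - lo) / scale for p in noisy]

out = preprocess([0, 64, 128, 192, 255])  # every value lands in [0, 1]
```

After normalization the minimum maps to 0 and the maximum to 1, which matches the normalization preprocessing recited in claim 1.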
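The ECA module of claim 4 can be sketched in plain Python. The adaptive odd kernel size follows the published ECA-Net formula k ≈ |log₂(C)/γ + b/γ| with γ = 2, b = 1; the uniform 1-D kernel below is a stand-in for the learned convolution, so this is a toy illustration rather than the patent's exact module.

```python
import math

def eca_kernel_size(channels, gamma=2, b=1):
    # Adaptive odd kernel size from ECA-Net: k grows with log2 of the channel count.
    t = int(abs(math.log2(channels) / gamma + b / gamma))
    return t if t % 2 else t + 1

def eca_channel_weights(channel_means, k):
    # 1-D "convolution" (uniform kernel as a stand-in for the learned one)
    # over per-channel global-average descriptors, then a sigmoid gate.
    half = k // 2
    weights = []
    for i in range(len(channel_means)):
        window = channel_means[max(0, i - half): i + half + 1]
        conv = sum(window) / len(window)
        weights.append(1.0 / (1.0 + math.exp(-conv)))  # channel weight in (0, 1)
    return weights

k = eca_kernel_size(64)                        # 64 channels -> k = 3
w = eca_channel_weights([0.2, -0.5, 1.3, 0.0], k)
```

Each output weight gates one channel, realizing the local inter-channel interaction and information weighting recited in claim 4.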
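The multi-scale re-weighting of claim 5 fuses feature maps produced by different kernel sizes with weights in 0.25-0.45. A minimal sketch, assuming the weights are clamped to that range and renormalized to sum to one (the patent gives only the range, not the scheme):

```python
def fuse_multiscale(feature_maps, raw_weights, lo=0.25, hi=0.45):
    # feature_maps: one flat list per scale (e.g. from 3x3, 5x5, 7x7 convs).
    clamped = [min(max(w, lo), hi) for w in raw_weights]
    total = sum(clamped)
    weights = [w / total for w in clamped]  # renormalize to sum to 1
    fused = [sum(w * fmap[i] for w, fmap in zip(weights, feature_maps))
             for i in range(len(feature_maps[0]))]
    return fused, weights

maps = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]   # three scales, two positions
fused, weights = fuse_multiscale(maps, [0.9, 0.3, 0.1])
```

Here the raw weights 0.9 and 0.1 are clamped to 0.45 and 0.25, so no single scale dominates the fusion.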
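Claims 6 and 7 can be illustrated step by step: partial-connection sampling of node inputs (proportion 0.5-0.6), a focal loss that up-weights hard samples (weight coefficient 1.2-1.5), and one Adam update with β₁ = 0.8-0.9, β₂ = 0.999, and learning rate 0.0002-0.001. The focusing exponent γ = 2 is a conventional choice assumed here, not stated in the patent.

```python
import math
import random

def sample_inputs(n_edges, ratio=0.5, seed=0):
    # Partial-connection mechanism: keep only a proportion of candidate input edges.
    rng = random.Random(seed)
    k = max(1, round(n_edges * ratio))
    return sorted(rng.sample(range(n_edges), k))

def focal_loss(p, y, alpha=1.2, gamma=2.0):
    # alpha is the patent's 1.2-1.5 weight coefficient; p = P(class 1).
    pt = p if y == 1 else 1.0 - p
    return -alpha * (1.0 - pt) ** gamma * math.log(max(pt, 1e-12))

def adam_step(theta, grad, m, v, t, lr=0.0002, b1=0.9, b2=0.999, eps=1e-8):
    # One bias-corrected Adam update on a scalar parameter.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad * grad
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return theta - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

edges = sample_inputs(10, ratio=0.5)   # 5 of 10 candidate edges kept
hard = focal_loss(0.1, 1)              # badly misclassified sample
easy = focal_loss(0.9, 1)              # confidently correct sample
theta, m, v = adam_step(1.0, grad=2.0, m=0.0, v=0.0, t=1)
```

The (1 - pt)^γ factor makes the hard sample's loss orders of magnitude larger than the easy sample's, which is exactly the "higher weight to hard-to-classify samples" behavior recited in claim 6.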
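For the structure selection of claim 8, a generator/discriminator pair could be accepted when the similarity of their architecture encodings meets the 0.85-0.9 threshold. The claims do not specify how architectures are encoded as vectors, so the vectors below are placeholders.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def structures_match(gen_vec, disc_vec, threshold=0.85):
    # Accept the pair only if the two encodings are sufficiently aligned.
    return cosine_similarity(gen_vec, disc_vec) >= threshold

same = structures_match([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])   # identical encodings
diff = structures_match([1.0, 0.0], [0.0, 1.0])             # orthogonal encodings
```

Claim 8 also allows Euclidean distance in place of cosine similarity; only the cosine variant is sketched here.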
Description
Crop image completion and enhancement generation method based on adversarial search
Technical Field
The invention relates to the field of intelligent crop pest and disease monitoring, and in particular to a crop image completion and enhancement generation method based on adversarial search.
Background
With the acceleration of agricultural modernization, intelligent detection, prevention, and control technologies for crop pests and diseases have received increasing attention. In practical application scenarios, however, factors such as unstable illumination, limitations of imaging equipment, and complex field environments mean that the acquired crop images often suffer from poor quality and missing detail, which severely hampers subsequent pest and disease identification and analysis. Image completion and enhancement technology, as an important means of addressing this problem, has therefore become a key element in the development of intelligent agriculture. At present, deep-learning-based image processing methods have made some progress in the field of crop image enhancement. CN112488963A discloses an enhancement method for crop disease data, which adds a class-activation-map attention module into the generator and discriminator of a generative adversarial network so that the shape and texture of the generated diseased-leaf images are more realistic. CN116797537A proposes a crop pest detection method that uses an attention convolutional neural network and an adversarial module to improve detection accuracy, training the network with a set of generated adversarial samples. CN119478709A describes a lightweight crop image super-resolution and disease identification method that combines a generative adversarial network with model distillation, reducing model parameters and computational requirements while maintaining image enhancement performance.
In the field of image restoration, CN111047541A proposes an image restoration method based on a wavelet-transform attention model, which decomposes a damaged image into multiple frequency subbands using the discrete wavelet transform (DWT), extracts deep information through an attention mechanism, and finally generates the restored image through the inverse discrete wavelet transform (IDWT) [4]. In addition, CN113781377A introduces an infrared and visible-light image fusion method based on adversarial semantic guidance and perception, which uses a segmentation network as the discriminator and optimizes the fusion process through adversarial learning to enhance the target saliency of the fused image. However, the prior art still suffers from the following deficiencies in handling crop image completion and enhancement tasks. Existing crop image processing methods mostly adopt a fixed network structure, lack the ability to adaptively adjust the network architecture for different scenes, and struggle to cope with complex, changeable farmland environments and diverse pest and disease manifestations. Traditional generative adversarial networks are prone to mode collapse and unstable training, especially when processing high-resolution crop images, where large quality fluctuations make it difficult to guarantee consistent completion results. Existing methods also tend to ignore the differing importance of features at different scales, making it hard to account for both global structure and local detail, so the generated images insufficiently reproduce microscopic features such as lesion textures.
The attention mechanisms in the prior art are relatively simple in design, lack optimization tailored to the characteristics of crop images, and have difficulty effectively capturing the differences between pest- and disease-affected regions and healthy regions. In network architecture search, existing methods do not consider the structural cooperativity between the generator and the discriminator, making it difficult to achieve the optimal image completion effect while maintaining adversarial balance. Therefore, there is a need for an image completion and enhancement method that can adaptively optimize the network architecture, stabilize the training process, effectively fuse multi-scale features, and is designed for the characteristics of crop images, so as to improve the accuracy and reliability of crop pest and disease detection.
Disclosure of Invention
The invention provides a crop image completion and enhancement generation method based on adversarial search, which aims to solve the problems that the identification accuracy is reduced due to the influence of low resolution when the existing crop disease and pest detection technology is used for processing complex environments, global and l