CN-117474885-B - Remote sensing image parallax change processing method for unmanned aerial vehicle garbage scattering area detection

CN 117474885 B

Abstract

The invention relates to the technical field of computer vision, in particular to a remote sensing image parallax change processing method, device and storage medium for detecting a garbage scattering area of an unmanned aerial vehicle. The method comprises: downsampling, based on a feature pyramid module, at least two unmanned aerial vehicle remote sensing images acquired by an unmanned aerial vehicle from a garbage scattering area to obtain a multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image; performing feature alignment mapping, based on an optical flow alignment module, on the multi-level feature atlases corresponding to two matched unmanned aerial vehicle remote sensing images to obtain a pair of feature-aligned unmanned aerial vehicle remote sensing images as an image combination; and performing feature difference mapping processing on the image combination based on a progressive difference feature fusion and detection module to eliminate repeated features in the combination and obtain a multi-level feature difference map subjected to parallax change processing. The method aims to solve the problem of how to perform parallax change processing on unmanned aerial vehicle remote sensing images with viewing angle differences.

Inventors

  • YANG YANG
  • DONG YAXIN
  • ZHAO PAN
  • BAI HAICHENG
  • XING LIN

Assignees

  • Yunnan Normal University (云南师范大学)

Dates

Publication Date
2026-05-12
Application Date
2023-11-08

Claims (8)

  1. A remote sensing image parallax change processing method for unmanned aerial vehicle garbage scattering area detection, characterized by being applied to an unmanned aerial vehicle parallax change processing system, wherein the unmanned aerial vehicle parallax change processing system is provided with an end-to-end change detection network model, and the method comprises the following steps: based on a feature pyramid module, downsampling at least two unmanned aerial vehicle remote sensing images acquired by an unmanned aerial vehicle from a garbage scattering area to obtain a multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image; based on an optical flow alignment module, performing feature alignment mapping on the multi-level feature atlases corresponding to two matched unmanned aerial vehicle remote sensing images to obtain a pair of feature-aligned unmanned aerial vehicle remote sensing images as an image combination, wherein the optical flow alignment module comprises an optical flow estimator for optical flow estimation and a warping layer for performing feature alignment between the multi-level feature atlases corresponding to the two matched unmanned aerial vehicle remote sensing images; wherein the two matched unmanned aerial vehicle remote sensing images comprise a first unmanned aerial vehicle remote sensing image and a second unmanned aerial vehicle remote sensing image, the multi-level feature atlases comprise a first multi-level feature atlas corresponding to the first unmanned aerial vehicle remote sensing image and a second multi-level feature atlas corresponding to the second unmanned aerial vehicle remote sensing image, and the step of performing feature alignment mapping on the multi-level feature atlases corresponding to the two matched unmanned aerial vehicle remote sensing images based on the optical flow alignment module comprises the following steps: calculating a local correlation map and a global correlation map between the first multi-level feature atlas and the second multi-level feature atlas; invoking the optical flow estimator; determining a global optical flow estimation result for the lowest-resolution feature map in the multi-level feature atlas based on the global correlation map and the optical flow estimator, and determining local optical flow estimation results for the feature maps other than the lowest-resolution feature map based on the local correlation map and the optical flow estimator, wherein the multi-level feature atlas comprises the first multi-level feature atlas and the second multi-level feature atlas; according to the global optical flow estimation result and the local optical flow estimation results, moving the pixel points in the first multi-level feature atlas and the second multi-level feature atlas through the warping layer to obtain a coarsely aligned first unmanned aerial vehicle remote sensing image and a coarsely aligned second unmanned aerial vehicle remote sensing image; determining the coarsely aligned first unmanned aerial vehicle remote sensing image and the coarsely aligned second unmanned aerial vehicle remote sensing image as the unmanned aerial vehicle remote sensing image combination; and based on a progressive difference feature fusion and detection module, performing feature difference mapping processing on the unmanned aerial vehicle remote sensing image combination to eliminate repeated features in the combination and obtain a multi-level feature difference map subjected to parallax change processing.
  2. The method of claim 1, wherein the feature pyramid module is built based on a VGG16 network module, and the step of downsampling at least two unmanned aerial vehicle remote sensing images acquired by the unmanned aerial vehicle from the garbage scattering area based on the feature pyramid module to obtain the multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image comprises: performing feature extraction on the unmanned aerial vehicle remote sensing image to obtain a plurality of feature maps at different levels, which form the multi-level feature atlas corresponding to the unmanned aerial vehicle remote sensing image; wherein the number of channels of the feature map at each level is twice the number of channels of the feature map at the adjacent upper level.
  3. The method of claim 1, wherein the warping layer comprises a first type of warping layer and a second type of warping layer, and the step of moving the pixel points in the first multi-level feature atlas and the second multi-level feature atlas through the warping layer according to the global optical flow estimation result and the local optical flow estimation results to obtain a coarsely aligned first multi-level feature atlas and a coarsely aligned second multi-level feature atlas comprises: warping the feature maps of the second multi-level feature atlas through the first type of warping layer to align them with the first multi-level feature atlas, thereby obtaining the coarsely aligned first multi-level feature atlas and the coarsely aligned second multi-level feature atlas; and/or warping the feature map of the current level through the second type of warping layer using the optical flow estimation result of the previous level, so as to align the feature map of each level of the second multi-level feature atlas with the feature map of the corresponding level in the first multi-level feature atlas, thereby obtaining the coarsely aligned first multi-level feature atlas and the coarsely aligned second multi-level feature atlas, wherein the optical flow estimation result comprises the global optical flow estimation result and the local optical flow estimation results; and the second type of warping layer is applied to the feature maps of the multi-level feature atlas other than the bottommost feature map.
  4. The method of claim 1, wherein the unmanned aerial vehicle remote sensing image combination comprises a coarsely aligned first unmanned aerial vehicle remote sensing image and a coarsely aligned second unmanned aerial vehicle remote sensing image, and the step of performing feature difference mapping processing on the unmanned aerial vehicle remote sensing image combination based on the progressive difference feature fusion and detection module to eliminate repeated features in the combination and obtain the multi-level feature difference map subjected to parallax change processing comprises: determining a feature difference map between the feature maps of the coarsely aligned second unmanned aerial vehicle remote sensing image and the feature maps of the coarsely aligned first unmanned aerial vehicle remote sensing image, and calculating the absolute value of the feature difference map; upsampling the feature difference map generated at the previous level to obtain an upsampled feature difference map; and fusing the absolute value of the difference map with the upsampled feature difference map through an attention mechanism and multi-layer convolution processing to obtain the multi-level feature difference map.
  5. The method of claim 1, wherein the step of downsampling at least two unmanned aerial vehicle remote sensing images acquired by the unmanned aerial vehicle from the garbage scattering area based on the feature pyramid module to obtain the multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image further comprises: constraining, based on an endpoint error loss function, the optical flow error loss of the feature-aligned unmanned aerial vehicle remote sensing image combination relative to the initially acquired unmanned aerial vehicle remote sensing image combination; and constraining, based on a binary cross entropy loss function, the feature errors between the deep features of the multi-level feature difference map and the input unmanned aerial vehicle remote sensing images.
  6. The method of claim 1, wherein the step of downsampling at least two unmanned aerial vehicle remote sensing images acquired by the unmanned aerial vehicle from the garbage scattering area based on the feature pyramid module to obtain the multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image further comprises: acquiring a remote sensing image training data set, and enhancing it through random flipping to obtain a preprocessed remote sensing image training data set; determining a garbage scattering area in the preprocessed remote sensing image training data set, and performing random affine transformation on the image pairs in the garbage scattering area to enhance the viewpoint differences in the preprocessed remote sensing image training data set; and training a preset neural network in the unmanned aerial vehicle based on the preprocessed remote sensing image training data set with enhanced viewpoint differences; wherein the remote sensing image training data set comprises a change map synthesized from a garbage distribution image with a viewpoint difference and a remote sensing image acquired by an unmanned aerial vehicle, together with the optical flow information corresponding to the garbage distribution image, the remote sensing image and the change map.
  7. An unmanned aerial vehicle parallax change processing system, characterized by comprising a memory, a processor, and a remote sensing image parallax change processing program for unmanned aerial vehicle garbage scattering area detection which is stored in the memory and executable on the processor, wherein the program, when executed by the processor, implements the steps of the remote sensing image parallax change processing method for unmanned aerial vehicle garbage scattering area detection according to any one of claims 1 to 6.
  8. A computer-readable storage medium, wherein a remote sensing image parallax change processing program for unmanned aerial vehicle garbage scattering area detection is stored on the computer-readable storage medium, and the program, when executed by a processor, implements the steps of the remote sensing image parallax change processing method for unmanned aerial vehicle garbage scattering area detection according to any one of claims 1 to 6.
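The warping layer referenced in claims 1 and 3 moves the pixel points of a feature map according to an estimated optical flow. The NumPy sketch below illustrates backward warping with bilinear sampling; the function name `warp`, the sampling direction, and the clamping of out-of-bounds coordinates are my own assumptions, since the patent does not specify interpolation details:

```python
import numpy as np

def warp(feat, flow):
    """Backward-warp a feature map by an optical-flow field.

    feat: (H, W, C) feature map; flow: (H, W, 2) per-pixel (dx, dy)
    displacements. Each output pixel samples feat at (x + dx, y + dy)
    with bilinear interpolation; sample coordinates are clamped to the
    image borders.
    """
    H, W, _ = feat.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    x = np.clip(xs + flow[..., 0], 0, W - 1)
    y = np.clip(ys + flow[..., 1], 0, H - 1)
    # integer corners and fractional weights for bilinear interpolation
    x0, y0 = np.floor(x).astype(int), np.floor(y).astype(int)
    x1, y1 = np.minimum(x0 + 1, W - 1), np.minimum(y0 + 1, H - 1)
    wx, wy = (x - x0)[..., None], (y - y0)[..., None]
    top = feat[y0, x0] * (1 - wx) + feat[y0, x1] * wx
    bot = feat[y1, x0] * (1 - wx) + feat[y1, x1] * wx
    return top * (1 - wy) + bot * wy
```

With a zero flow field the output equals the input; with a uniform flow of one pixel in x, each output pixel takes the value of its right-hand neighbour (border values are clamped).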
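Claim 4 describes one level of the progressive difference feature fusion: the absolute feature difference at the current level is combined with the upsampled difference map from the previous (coarser) level through an attention mechanism and convolutions. The sketch below shows the data flow only; nearest-neighbour upsampling and a simple sigmoid gate stand in for the learned upsampling, attention, and convolution stages, which the patent does not detail:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def upsample2x(x):
    # nearest-neighbour 2x upsampling, a stand-in for learned upsampling
    return x.repeat(2, axis=0).repeat(2, axis=1)

def fuse_level(feat_a, feat_b, prev_diff=None):
    """One level of a progressive difference-fusion scheme (illustrative).

    Computes the absolute feature difference at this level; if a
    coarser-level difference map exists, upsamples it 2x and blends the
    two with a sigmoid gate standing in for the attention + convolution
    stages of the patent's module.
    """
    diff = np.abs(feat_a - feat_b)
    if prev_diff is None:
        return diff
    up = upsample2x(prev_diff)
    gate = sigmoid(diff - up)  # trust the finer level where it disagrees
    return gate * diff + (1 - gate) * up
```

Applied coarse-to-fine across the pyramid, each level refines the difference map handed up from the level below it.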
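The two losses named in claim 5 are standard: the endpoint error constrains the optical flow estimate, and binary cross entropy constrains the change prediction. A minimal sketch of both, assuming a flow field of shape (H, W, 2) and a per-pixel change-probability map against a binary mask:

```python
import numpy as np

def endpoint_error(flow_pred, flow_gt):
    """Mean endpoint error: the average Euclidean distance between
    predicted and ground-truth flow vectors of shape (H, W, 2)."""
    return float(np.mean(np.linalg.norm(flow_pred - flow_gt, axis=-1)))

def binary_cross_entropy(pred, target, eps=1e-7):
    """Per-pixel binary cross entropy between a predicted change
    probability map and a binary change mask; predictions are clipped
    away from 0 and 1 for numerical stability."""
    p = np.clip(pred, eps, 1 - eps)
    return float(np.mean(-(target * np.log(p) + (1 - target) * np.log(1 - p))))
```

A perfect flow prediction yields an endpoint error of 0, and a uniform prediction of 0.5 yields a BCE of ln 2 regardless of the mask.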
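Claim 6's data augmentation (random flipping of the training data, plus a random affine transformation applied to image pairs to enhance viewpoint differences) can be sketched as follows. A pure translation is used here as the simplest affine map, and the wrap-around at borders from `np.roll` is a simplification of a properly padded affine warp; the function names and the shift range are illustrative, not from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_flip(img_a, img_b):
    """Apply the same random horizontal/vertical flip to both images
    of a pair, so the pair stays registered."""
    if rng.random() < 0.5:
        img_a, img_b = img_a[:, ::-1], img_b[:, ::-1]
    if rng.random() < 0.5:
        img_a, img_b = img_a[::-1], img_b[::-1]
    return img_a, img_b

def random_translate(img, max_shift=4):
    """Random integer translation (the simplest affine map) applied to
    one image of a pair to simulate a viewpoint difference. np.roll
    wraps at the borders, which a real affine warp would pad instead."""
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    return np.roll(img, (int(dy), int(dx)), axis=(0, 1))
```

Both operations preserve image shape and pixel content, which keeps the synthesized optical flow ground truth consistent with the transformed pair.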

Description

Remote sensing image parallax change processing method for unmanned aerial vehicle garbage scattering area detection

Technical Field

The invention relates to the technical field of computer vision, in particular to a remote sensing image parallax change processing method for detecting a garbage scattering area of an unmanned aerial vehicle.

Background

Small unmanned aerial vehicles play an important role in remote sensing observation of the earth, offering advantages such as cooperative operation, ease of use, high-resolution image acquisition, and flexibility. When a small unmanned aerial vehicle captures images at the same location but at different times, the acquired images inevitably exhibit a viewing angle difference due to factors such as inaccurate GPS (Global Positioning System) signals, changes in flight attitude, complex wind speeds and directions, and the geographical environment. To detect changes in images with a viewing angle difference, one related technical scheme adopts a weakly supervised semantic scene change detection model: the image is divided into grid cells, features of the grid cells are extracted with a Convolutional Neural Network (CNN), Euclidean distances between grid cell features are calculated, and whether the image in a grid area has changed is judged from those distances. However, the inventors found in the course of conceiving and implementing the present application that, when an unmanned aerial vehicle is applied to garbage scattering area detection and such a conventional parallax change detection model is used, image changes with a viewing angle difference can be accurately detected only when the corresponding pixels fall into grid cells at the same position.
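The grid-cell comparison scheme described above can be sketched as follows. Raw pixel values stand in for the CNN features, and the cell size and distance threshold are illustrative parameters, not values from the cited scheme:

```python
import numpy as np

def grid_change_map(feat_a, feat_b, cell=8, thresh=1.0):
    """Grid-cell change detection in the style of the background art:
    pool features over each cell, compare cells at the same position
    by Euclidean distance, and flag a change where the distance
    exceeds a threshold. CNN feature extraction is replaced here by
    raw per-pixel values for illustration."""
    H, W, C = feat_a.shape
    gh, gw = H // cell, W // cell
    changed = np.zeros((gh, gw), dtype=bool)
    for i in range(gh):
        for j in range(gw):
            a = feat_a[i*cell:(i+1)*cell, j*cell:(j+1)*cell].mean(axis=(0, 1))
            b = feat_b[i*cell:(i+1)*cell, j*cell:(j+1)*cell].mean(axis=(0, 1))
            changed[i, j] = np.linalg.norm(a - b) > thresh
    return changed
```

The sketch makes the limitation concrete: a change shifted by a viewing angle difference lands in a different cell than its counterpart, so the same-position comparison either misses it or flags the wrong cell.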
In addition, since the ability to process images with a viewing angle difference depends on the grid cell size, enlarging the grid cells to handle a large viewing angle difference can introduce errors into the change detection area. The method is therefore difficult to apply to image change processing with a large viewing angle difference and suffers from insufficient applicability. The foregoing is provided merely to facilitate understanding of the technical solutions of the present invention and is not an admission that it constitutes prior art.

Disclosure of Invention

The invention mainly aims to provide a remote sensing image parallax change processing method for detecting a garbage scattering area of an unmanned aerial vehicle, and aims to solve the problem of how to perform parallax change processing on unmanned aerial vehicle remote sensing images with viewing angle differences.
In order to achieve the above object, the present invention provides a remote sensing image parallax change processing method for detecting a garbage scattering area of an unmanned aerial vehicle, the method comprising: based on a feature pyramid module, downsampling at least two unmanned aerial vehicle remote sensing images acquired by an unmanned aerial vehicle from a garbage scattering area to obtain a multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image; based on an optical flow alignment module, performing feature alignment mapping on the multi-level feature atlases corresponding to two matched unmanned aerial vehicle remote sensing images to obtain a pair of feature-aligned unmanned aerial vehicle remote sensing images as an image combination, wherein the optical flow alignment module comprises an optical flow estimator for optical flow estimation and a warping layer for performing feature alignment between the multi-level feature atlases corresponding to the two matched unmanned aerial vehicle remote sensing images; and based on a progressive difference feature fusion and detection module, performing feature difference mapping processing on the unmanned aerial vehicle remote sensing image combination to eliminate repeated features in the combination and obtain a multi-level feature difference map subjected to parallax change processing.
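Assuming the optical flow alignment has already been applied, the overall pipeline (feature pyramid followed by per-level difference maps) can be outlined as a minimal sketch. Average pooling stands in for the VGG16 downsampling stages, and a plain per-level absolute difference stands in for the progressive fusion and detection module:

```python
import numpy as np

def avg_pool2x(x):
    """2x2 average pooling, a stand-in for one downsampling stage."""
    H, W = x.shape[0] // 2 * 2, x.shape[1] // 2 * 2
    x = x[:H, :W]
    return 0.25 * (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2])

def feature_pyramid(img, levels=3):
    """Stand-in for the VGG16 feature pyramid: repeated 2x downsampling."""
    feats = [img]
    for _ in range(levels - 1):
        feats.append(avg_pool2x(feats[-1]))
    return feats

def change_pipeline(img_a, img_b):
    """End-to-end sketch: build a pyramid for each (already aligned)
    image, then compute a per-level absolute difference map, fine to
    coarse."""
    pyr_a, pyr_b = feature_pyramid(img_a), feature_pyramid(img_b)
    return [np.abs(a - b) for a, b in zip(pyr_a, pyr_b)]
```

Two identical aligned inputs yield all-zero difference maps at every level, which is the behaviour the repeated-feature elimination in the claimed module aims to approximate for unchanged regions.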
Optionally, the feature pyramid module is built based on a VGG16 network module, and the step of downsampling at least two unmanned aerial vehicle remote sensing images acquired by the unmanned aerial vehicle from the garbage scattering area based on the feature pyramid module to obtain a multi-level feature atlas corresponding to each unmanned aerial vehicle remote sensing image includes: performing feature extraction on the unmanned aerial vehicle remote sensing image to obtain a plurality of feature maps at different levels, forming the multi-level feature atlas corresponding to the unmanned aerial vehicle remote sensing image.