
CN-121982071-A - Unmanned aerial vehicle aerial survey image real-time interpretation and ground object labeling method and system

CN121982071A

Abstract

The invention discloses a method and system for real-time interpretation and ground object labeling of unmanned aerial vehicle aerial survey images, relating to the field of real-time interpretation and ground object labeling. The system comprises an acquisition and positioning module, a visual interference module, a space-time evolution module, and a determination and processing module. Through real-time acquisition and multidimensional positioning-and-resolving calculation, combined with clustering and association analysis of characteristic curves, pose-resolving bump events and the time windows in which they occur are determined. The causal interference domain that caused the bump is identified by analyzing the dense motion vector field and feature loss regions between images, then back-projected into physical space to determine a risk analysis area, within which multiple risk factors are comprehensively calculated to generate a drift risk space-time prediction map. Finally, space-time alignment and unified mapping are performed, a comprehensive labeling risk index is decoded for each grid, and a hierarchical response mechanism is started to obtain the final validity determination and processing of the ground object labeling, thereby fundamentally guaranteeing the spatial reliability and temporal usability of the labeling results.

Inventors

  • Zhao Caiquan

Assignees

  • 云南全测精达科技有限公司

Dates

Publication Date
2026-05-05
Application Date
2026-04-08

Claims (13)

  1. An unmanned aerial vehicle aerial survey image real-time interpretation and ground object labeling method, characterized by comprising the following steps: acquiring and analyzing a multidimensional positioning-and-resolving internal data stream in real time, identifying track drift points with a dynamic baseline model, and determining pose-resolving bump events and the time windows in which they occur by combining clustering with association analysis of characteristic curves; using the determined bump event and its time window, identifying the causal interference domain of the bump by analyzing the dense motion vector field and feature loss regions between images, and generating a geospatial distribution map of visual interference sources; determining a risk analysis area based on the geospatial distribution map, and synthesizing the risk factors of basic texture deficiency, illumination change and dynamic target occupancy to generate a drift risk space-time evolution prediction map; and performing space-time alignment and unified mapping of the geospatial distribution map of visual interference sources and the drift risk space-time evolution prediction map, and decoding a comprehensive labeling risk index for each grid to obtain the final validity determination and processing of the ground object labeling.
  2. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 1, wherein the step of determining the pose-resolving bump event and its time window comprises: constructing a first visual feature point distribution entropy curve, a second filtering state covariance curve and a third scale factor curve from the multidimensional positioning-and-resolving internal data stream; and obtaining a discrete set of drift time points from single-source nominal track drift point analysis, and delimiting a drift analysis window through clustering.
  3. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 2, wherein the step of determining the pose-resolving bump event and its time window further comprises: identifying a first causal instability characteristic relation and a second symptomatic resonance characteristic relation within the drift analysis window to determine a pose-resolving bump event, and recording its start time and duration, wherein the start time is the moment at which a significant local minimum appears in the first visual feature point distribution entropy curve, and the duration runs from the start time until the value of the second filtering state covariance curve falls back from its significant local maximum to below the upper-limit threshold of the determinant value of the fusion filtering state covariance submatrix.
  4. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 2, wherein the step of obtaining the single-source nominal track drift points comprises: while the unmanned aerial vehicle executes the aerial survey task, acquiring and synchronizing the multidimensional positioning-and-resolving internal data streams in real time, the internal data streams comprising the visual odometer scale factor, the determinant value of the fusion filtering state covariance submatrix and the effective visual feature point distribution entropy; and establishing sliding-window dynamic baseline models for the multidimensional positioning-and-resolving internal data streams respectively, and identifying outliers exceeding a preset threshold to obtain the single-source nominal track drift points.
  5. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 3, wherein the step of identifying the first causal instability characteristic relation and the second symptomatic resonance characteristic relation comprises: detecting a significant local minimum on the first visual feature point distribution entropy curve while detecting a significant local maximum on the second filtering state covariance curve, and determining that a first causal instability characteristic relation is identified; and, centered on the moment of the significant local maximum detected in the first causal instability characteristic relation, calculating the local variance of all data points of the third scale factor curve within a fixed-length time window, and determining a second symptomatic resonance characteristic relation therefrom.
  6. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 1, wherein the step of obtaining the geospatial distribution map of visual interference sources comprises: performing full-spectrum filtering on the images of the high-frame-rate image sequence around the event occurrence time to generate a group of filtering responses, aggregating the responses to obtain a characteristic texture intensity energy map, and applying low-threshold segmentation and morphological optimization to the energy map to obtain full-spectrum filtering response valley regions; when the spatial overlap rate of a non-dominant motion manifold region and a full-spectrum filtering response valley region exceeds a preset causal association threshold, confirming their union as a causally attributed interference domain; and obtaining the interference type and influence intensity of the causally attributed interference domain, and back-projecting and aggregating it from image space into geographic space according to the unmanned aerial vehicle pose to obtain the geospatial distribution map of visual interference sources.
  7. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 6, wherein the step of obtaining the non-dominant motion manifold region comprises: determining a problem time window from the determined pose-resolving bump event to obtain a high-frame-rate image sequence around the event occurrence time; and calculating the dense motion vector field between consecutive frames of that sequence, and performing motion mode decoupling and significance discrimination on the dense motion vector field to obtain the non-dominant motion manifold region.
  8. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 1, wherein the step of obtaining the drift risk space-time evolution prediction map comprises: constructing a three-dimensional field-of-view cone based on a preset task file, and performing three-dimensional intersection calculation between the cone and the three-dimensional digital surface model of the risk quantitative analysis area to obtain the field-of-view footprint of the unmanned aerial vehicle.
  9. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 8, wherein the step of obtaining the drift risk space-time evolution prediction map further comprises: after normalizing the illumination change risk index and the dynamic target interference risk index, performing weighted fusion with the basic texture starvation index to calculate the total risk potential of each spatial position at each moment; and, based on the preset task file, summing at each moment the total risk potential of all positions falling within the field-of-view footprint of the unmanned aerial vehicle to obtain the instantaneous risk score at that moment, and arranging the instantaneous risk scores in time order to obtain the drift risk space-time evolution prediction map.
  10. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 9, wherein the step of obtaining the dynamic target interference risk index comprises: determining a risk quantitative analysis area based on the geospatial distribution map of visual interference sources; and performing gridded analysis of the risk quantitative analysis area, calculating the corner density of each grid cell, and normalizing and inverting the corner density to obtain the basic texture starvation index.
  11. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 10, wherein the step of obtaining the dynamic target interference risk index further comprises: generating a binary shadow map from the time-series sun position and the three-dimensional model, and obtaining the illumination change risk index by calculating the temporal gradient of the binary shadow map; and establishing a probabilistic dynamic target occupancy grid model from historical data, and predicting the occupancy probability and expected speed of each grid cell to calculate the dynamic target interference risk index.
  12. The method for real-time interpretation and ground object labeling of aerial survey images of an unmanned aerial vehicle according to claim 1, wherein the final validity determination and processing of the ground object labeling comprises: performing space-time alignment and unified mapping of the geospatial distribution map of visual interference sources and the drift risk space-time evolution prediction map to obtain a multi-layer defect information map; constructing a defect coupling enhancement characterization vector from the multi-layer defect information map, and obtaining a comprehensive labeling risk index from that vector through risk decoding; and performing hierarchical risk response and a pose correction closed loop based on the comprehensive labeling risk index to carry out the final validity determination and processing of the ground object labeling.
  13. A system for real-time interpretation and ground object labeling of unmanned aerial vehicle aerial survey images, for implementing the method according to any one of claims 1 to 12, comprising: an acquisition and positioning module for acquiring and analyzing the multidimensional positioning-and-resolving internal data stream in real time, identifying track drift points with a dynamic baseline model, and determining pose-resolving bump events and their time windows by combining clustering with association analysis of characteristic curves; a visual interference module for identifying, from the determined bump event and its time window, the causal interference domain of the bump by analyzing the dense motion vector field and feature loss regions between images, and generating a geospatial distribution map of visual interference sources; a space-time evolution module for determining a risk analysis area based on the geospatial distribution map, synthesizing the risk factors of basic texture deficiency, illumination change and dynamic target occupancy, and generating a drift risk space-time evolution prediction map; and a determination and processing module for performing space-time alignment and unified mapping of the geospatial distribution map of visual interference sources and the drift risk space-time evolution prediction map, and decoding a comprehensive labeling risk index for each grid to obtain the final validity determination and processing of the ground object labeling.
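For illustration only (not part of the claims), claim 4's sliding-window dynamic baseline and claim 2's clustering of drift time points into analysis windows could be sketched as follows. The window length, deviation multiplier `k` and clustering `gap` are assumed values, not taken from the patent:

```python
import numpy as np

def drift_points(signal, win=20, k=3.0):
    """Flag indices where the signal deviates from its sliding-window
    baseline (mean +/- k standard deviations), i.e. the outliers that
    become single-source nominal track drift points."""
    sig = np.asarray(signal, dtype=float)
    flags = []
    for i in range(win, len(sig)):
        base = sig[i - win:i]
        mu, sd = base.mean(), base.std()
        if sd > 0 and abs(sig[i] - mu) > k * sd:
            flags.append(i)
    return flags

def cluster_windows(points, gap=5):
    """Merge nearby drift time points into drift analysis windows:
    points closer than `gap` samples fall into the same window."""
    windows = []
    for p in sorted(points):
        if windows and p - windows[-1][1] <= gap:
            windows[-1][1] = p
        else:
            windows.append([p, p])
    return [tuple(w) for w in windows]
```

Applied to, say, a visual odometer scale factor stream with an injected jump, `drift_points` flags the jump samples and `cluster_windows` collapses them into one window for the subsequent curve association analysis.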
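Likewise, claim 9's risk fusion (normalize the illumination and dynamic-target indices, weight-fuse them with the texture starvation index, then sum over the field-of-view footprint) admits a minimal sketch. The weights and the min-max normalization are illustrative assumptions; the patent does not specify them:

```python
import numpy as np

def total_risk_potential(texture_starvation, illumination_risk, dynamic_risk,
                         w_tex=0.4, w_illum=0.3, w_dyn=0.3):
    """Normalize the illumination and dynamic-target indices to [0, 1]
    and fuse them with the basic texture starvation index by weighted sum.
    Weights are placeholder values for illustration."""
    def norm(x):
        x = np.asarray(x, dtype=float)
        span = x.max() - x.min()
        return (x - x.min()) / span if span > 0 else np.zeros_like(x)
    return (w_tex * np.asarray(texture_starvation, dtype=float)
            + w_illum * norm(illumination_risk)
            + w_dyn * norm(dynamic_risk))

def instantaneous_risk(potential, footprint_mask):
    """Sum the total risk potential over grid cells inside the UAV's
    field-of-view footprint to get one instantaneous risk score."""
    return float((potential * footprint_mask).sum())
```

Evaluating `instantaneous_risk` for each moment of the mission and ordering the scores in time yields the shape of the drift risk space-time evolution prediction map described in the claim.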

Description

Unmanned aerial vehicle aerial survey image real-time interpretation and ground object labeling method and system

Technical Field

The invention relates to the field of real-time interpretation and ground object labeling, and in particular to a method and system for real-time interpretation and ground object labeling of unmanned aerial vehicle aerial survey images.

Background

In the prior art, unmanned aerial vehicles, by virtue of their high maneuverability and flexibility, are widely applied to long-duration hovering or low-speed cruising tasks over specific areas (such as building sites, agricultural plots and disaster sites) to perform real-time video monitoring, ground feature identification and geographic information system labeling. To achieve accurate positioning on consumer or industrial unmanned aerial vehicles that do not carry expensive real-time kinematic (RTK) differential modules, visual inertial odometry (VIO) is the dominant solution. However, the reliability of a VIO algorithm depends heavily on its core assumptions, namely the static nature and texture richness of the observed scene. In practice, unmanned aerial vehicles often face visually challenging environments, such as flying over water surfaces or textureless solid walls, encountering large, rapidly moving clouds and shadows, or large dynamic objects in the scene. These conditions violate the VIO algorithm's assumptions, causing massive loss or mismatching of visual feature points, which in turn causes the pose-resolving system to bump or fail, ultimately manifesting as abrupt jumps or continuous drift in the unmanned aerial vehicle's pose estimate.
The drift can accumulate to several meters within minutes, so the ground feature labels the operator sees on screen develop serious position deviations on the corresponding geographic map (the label drift problem), greatly reducing the spatial credibility and temporal usability of real-time interpretation and labeling tasks. The prior art offers only preliminary approaches to labeling drift. One common approach is to simply monitor the number of feature points output by the VIO system and consider positioning unreliable when that number falls below a static threshold. This single-index judgment is too coarse: it cannot distinguish normal feature point fluctuation from symptoms of systematic failure, and readily produces large numbers of false positives and false negatives. Another approach attempts to verify the spatial consistency of current features against historically annotated features, but this is a passive, lagging check that loses its benchmark once the overall position has already drifted. More importantly, these methods generally lack the ability to accurately diagnose positioning failure events, trace their root causes, and predict risk prospectively. They cannot immediately initiate a reliable pose compensation strategy to fill the data gaps while visual signals are unreliable, so labeling operations must be interrupted or large amounts of invalid data are produced.
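The single-index check criticized above can be made concrete with a small sketch. The threshold value and feature counts are illustrative only; the point is that ordinary fluctuation around a static threshold already triggers repeated alarms:

```python
def feature_count_alarm(counts, threshold=80):
    """Prior-art style check: flag every frame whose VIO feature-point
    count falls below a single static threshold (illustrative value)."""
    return [i for i, c in enumerate(counts) if c < threshold]

# A healthy stream that merely fluctuates around the threshold still
# produces alarms at frames 2, 4 and 6 - the false positives that make
# this single-index judgment too coarse in practice.
counts = [120, 85, 78, 82, 79, 130, 77, 125]
```

Because the check sees only one scalar per frame, it cannot tell this benign jitter apart from the correlated entropy-minimum and covariance-maximum pattern the patent uses to identify a genuine bump event.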
Therefore, there is a need for a method and system that, without expensive hardware upgrades and through algorithm-level optimization alone, can perform deep state monitoring of a VIO system, accurately diagnose bump events, predict future risk, and initiate a hierarchical response and pose compensation closed loop, so as to fundamentally address the pain points of consumer unmanned aerial vehicles in professional mapping and long-term monitoring applications. To overcome the above defects, the following technical scheme is provided.

Disclosure of Invention

The present invention has been made to solve the technical problems set forth in the background art above. The invention provides a method and system for real-time interpretation and ground object labeling of unmanned aerial vehicle aerial survey images. The object of the invention is achieved by the following technical scheme. The unmanned aerial vehicle aerial survey image real-time interpretation and ground object labeling method comprises the following steps: acquiring and analyzing a multidimensional positioning-and-resolving internal data stream in real time, identifying track drift points with a dynamic baseline model, and determining pose-resolving bump events and their time windows by combining clustering with association analysis of characteristic curves; using the determined bump event and its time window, identifying the causal interference domain of the bump by analyzing the dense motion vector field and feature loss regions between images, and generating a geospatial distribution map of visual interference sources; determining a risk analysis area based on the geospatial distribution map, and synthesizing risk