CN-122024106-A - Intelligent interpretation method and system for unmanned aerial vehicle inspection image
Abstract
The invention discloses an intelligent interpretation method and system for unmanned aerial vehicle inspection images, relating to the technical fields of intelligent interpretation of unmanned aerial vehicle inspection images and airborne data processing. The method collects unmanned aerial vehicle inspection images and synchronized metadata, interprets them to obtain an interpretation probability distribution, and calculates an interpretation uncertainty. The sampling interval is updated based on the interpretation uncertainty, and a re-shooting control parameter packet is generated based on re-shooting benefit and resource constraints. Re-shooting and re-interpretation are then executed according to the re-shooting control parameter packet, the conventional interpretation result is fused with the re-shooting interpretation result, and a closed-loop termination judgment is output. By using interpretation uncertainty as a unified control quantity, the invention advances inspection from one-way acquisition with offline interpretation to a closed-loop flow of interpretation, decision-making, re-shooting, re-interpretation, fusion and termination, achieving targeted re-shooting of high-risk defect candidates and stable output under limited resources.
Inventors
- YANG HUA
- WANG YUKAI
- SUN HUAQIANG
- ZHANG XI
- ZHU JINBO
Assignees
- 怡利科技发展有限公司
Dates
- Publication Date: 2026-05-12
- Application Date: 2026-02-09
Claims (8)
- 1. An intelligent interpretation method for an unmanned aerial vehicle inspection image, characterized by comprising the following steps: collecting unmanned aerial vehicle inspection images and synchronized metadata, interpreting them to obtain an interpretation probability distribution and calculating an interpretation uncertainty; updating the sampling interval based on the interpretation uncertainty, and generating a re-shooting control parameter packet based on re-shooting benefit and resource constraints; and executing re-shooting and re-interpretation based on the re-shooting control parameter packet, fusing the conventional interpretation result with the re-shooting interpretation result, and performing closed-loop termination judgment and output.
- 2. The intelligent interpretation method for an unmanned aerial vehicle inspection image as claimed in claim 1, wherein collecting unmanned aerial vehicle inspection images and synchronized metadata comprises: the unmanned aerial vehicle camera acquires an inspection image at the current moment; unmanned aerial vehicle synchronized metadata with the same timestamp is acquired, comprising pose, flight altitude, heading, flight speed, gimbal pose, focal length, exposure parameters and link quality indicators; the inspection image and the synchronized metadata are written into a task cache of the unmanned aerial vehicle onboard processor according to their correspondence; and a frame number and acquisition time are recorded for each frame of inspection image.
- 3. The intelligent interpretation method for an unmanned aerial vehicle inspection image as claimed in claim 2, wherein interpreting to obtain an interpretation probability distribution and calculating an interpretation uncertainty comprises: inputting the inspection image into a preset interpretation model in the unmanned aerial vehicle onboard processor to output a class probability distribution of defect candidates, wherein the class probability distribution consists of a probability value for each defect class, and outputting the component identifier corresponding to each defect candidate; calculating the interpretation uncertainty from the class probability distribution by multiplying the probability value of each defect class by its logarithm, summing the products, and negating the sum, wherein each probability value represents the probability that the defect candidate belongs to the corresponding defect class, the defect classes are the class indices of the probability distribution, and the logarithm uses a preset base; and storing the interpretation uncertainty in association with the class probability distribution, position parameters, component identifier and synchronized metadata of the corresponding defect candidate as a conventional interpretation result.
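The uncertainty measure described in claim 3 (probabilities multiplied by their logarithms, summed, negated) is the Shannon entropy of the class probability distribution. A minimal sketch follows; the function and variable names are illustrative, not taken from the patent:

```python
import math

def interpretation_uncertainty(class_probs, base=math.e):
    """Shannon entropy of a defect-class probability distribution:
    each probability is multiplied by its logarithm, the products are
    summed, and the sum is negated. Zero-probability classes contribute 0."""
    return -sum(p * math.log(p, base) for p in class_probs if p > 0)

# A peaked distribution (confident interpretation) has low entropy;
# a near-uniform one (ambiguous interpretation) has high entropy.
confident = interpretation_uncertainty([0.9, 0.05, 0.05])
ambiguous = interpretation_uncertainty([0.34, 0.33, 0.33])
```

The preset logarithm base mentioned in the claim only rescales the entropy, so any base yields the same ordering of candidates by uncertainty.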
- 4. The intelligent interpretation method for an unmanned aerial vehicle inspection image as claimed in claim 3, wherein updating the sampling interval based on the interpretation uncertainty comprises: reading the interpretation uncertainty at the current moment and a reference uncertainty threshold, and multiplying the current sampling interval by an exponential adjustment factor to obtain a candidate sampling interval, wherein the exponential adjustment factor decreases monotonically as the interpretation uncertainty increases relative to the reference uncertainty threshold and is determined by a preset adjustment coefficient; and clamping the candidate sampling interval between the minimum sampling interval and the maximum sampling interval to obtain the updated sampling interval.
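The interval update in claim 4 can be sketched as follows. The specific exponential form and all names are assumptions consistent with the claim: the factor falls below 1 when uncertainty exceeds the reference threshold (sample more densely) and rises above 1 otherwise, and the result is clamped to the allowed range:

```python
import math

def update_sampling_interval(current_interval, uncertainty, ref_threshold,
                             k=1.0, min_interval=0.5, max_interval=10.0):
    """Multiply the current sampling interval by an exponential adjustment
    factor that decreases monotonically with (uncertainty - ref_threshold);
    k is the preset adjustment coefficient."""
    factor = math.exp(-k * (uncertainty - ref_threshold))
    candidate = current_interval * factor
    # Clamp the candidate interval between the minimum and maximum bounds.
    return min(max(candidate, min_interval), max_interval)
```

With this shape, highly uncertain frames shorten the interval toward `min_interval` so that critical areas are sampled more often, while confident stretches relax toward `max_interval`.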
- 5. The intelligent interpretation method for an unmanned aerial vehicle inspection image as claimed in claim 4, wherein generating the re-shooting control parameter packet based on re-shooting benefit and resource constraints comprises: constructing a composite re-shooting action set, wherein each composite re-shooting action comprises an altitude adjustment parameter, a gimbal angle adjustment parameter, a focal length adjustment parameter, a flight speed adjustment parameter, an exposure parameter, a fill-light parameter and a continuous-shooting frame count parameter, unadjusted parameters taking preset default values; calculating the re-shooting benefit for each composite re-shooting action, wherein the re-shooting benefit combines the interpretation uncertainty, a risk score and a cost score according to preset weights, the risk score is obtained by mapping the category identifier and component identifier of the defect candidate, the cost score is obtained by multiplying time consumption, power consumption and link occupation by their respective weights and summing the products, and the weights of items not participating in the calculation are zero; performing a feasibility judgment on each composite re-shooting action based on the resource constraints, wherein the resource constraints comprise a remaining-battery threshold and a no-fly-zone constraint as hard constraints, and a wind speed threshold and a link quality threshold as additional constraints; a composite re-shooting action is marked non-selectable when any hard constraint is violated, and its re-shooting benefit is reduced when an additional constraint is violated; and selecting, from the selectable composite re-shooting actions, the action with the largest re-shooting benefit, comparing that benefit with a re-shooting trigger threshold, and generating a re-shooting control parameter packet when the benefit is not smaller than the trigger threshold, wherein the re-shooting control parameter packet comprises flight control parameters, gimbal control parameters, camera control parameters and the corresponding defect candidate identifier, the camera control parameters comprising an exposure parameter, a focal length adjustment parameter, a fill-light parameter and a continuous-shooting frame count parameter.
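The benefit scoring, feasibility screening and action selection of claim 5 might be sketched as below. The linear weighting, the fixed soft-constraint penalty, and all field and function names are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class ReshootAction:
    uncertainty: float   # interpretation uncertainty of the defect candidate
    risk_score: float    # mapped from category and component identifiers
    time_cost: float     # time consumption of the action
    energy_cost: float   # power consumption of the action
    link_cost: float     # link occupation of the action
    hard_ok: bool        # remaining-battery and no-fly-zone hard constraints
    soft_ok: bool        # wind-speed and link-quality additional constraints

def reshoot_benefit(a, w_u=1.0, w_r=1.0, w_c=1.0,
                    cost_w=(1.0, 1.0, 1.0), soft_penalty=0.5):
    """Benefit combines uncertainty, risk and cost by preset weights;
    the cost score is a weighted sum of time, energy and link occupation.
    A violated additional (soft) constraint reduces the benefit."""
    cost = (cost_w[0] * a.time_cost + cost_w[1] * a.energy_cost
            + cost_w[2] * a.link_cost)
    benefit = w_u * a.uncertainty + w_r * a.risk_score - w_c * cost
    if not a.soft_ok:
        benefit -= soft_penalty
    return benefit

def select_reshoot(actions, trigger_threshold):
    """Discard actions violating hard constraints, pick the largest benefit,
    and trigger a control packet only if it reaches the trigger threshold."""
    feasible = [a for a in actions if a.hard_ok]
    if not feasible:
        return None
    best = max(feasible, key=reshoot_benefit)
    return best if reshoot_benefit(best) >= trigger_threshold else None
```

Setting a weight in `cost_w` to zero reproduces the claim's rule that items not participating in the calculation carry zero weight.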
- 6. The intelligent interpretation method for an unmanned aerial vehicle inspection image as claimed in claim 5, wherein executing re-shooting and re-interpretation based on the re-shooting control parameter packet comprises: issuing the re-shooting control parameter packet to the unmanned aerial vehicle flight control assembly, gimbal control assembly and camera control assembly to execute the re-shooting action; during execution, acquiring re-shot inspection images according to the continuous-shooting frame count parameter in the camera control parameters, and synchronously acquiring the synchronized metadata corresponding to each frame of re-shot inspection image; and inputting the re-shot inspection images into the same interpretation model as used for conventional interpretation to obtain re-shooting interpretation results, and storing the re-shooting interpretation results in association with the corresponding conventional interpretation results via the defect candidate identifiers.
- 7. The intelligent interpretation method for an unmanned aerial vehicle inspection image as claimed in claim 6, wherein fusing the conventional interpretation result with the re-shooting interpretation result and performing closed-loop termination judgment and output comprise: reading, for the conventional interpretation result and the re-shooting interpretation result associated with the same defect candidate identifier, their class probability distributions and interpretation uncertainties, and determining fusion weights from the interpretation uncertainties, wherein a fusion weight decreases as the corresponding interpretation uncertainty increases; weighting and normalizing the class probability distributions according to the fusion weights to obtain a fused class probability distribution, and updating the interpretation uncertainty based on the fused class probability distribution; and comparing the updated interpretation uncertainty with an uncertainty lower-limit threshold, a maximum re-shooting count threshold and a resource budget threshold, and outputting the defect judgment result corresponding to the fused class probability distribution together with the associated synchronized metadata record when the updated interpretation uncertainty is not larger than the uncertainty lower-limit threshold, the maximum re-shooting count is reached, or the resource budget threshold is triggered; otherwise, continuing to generate a re-shooting control parameter packet.
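A sketch of the fusion and termination logic of claim 7, under the assumption of inverse-entropy fusion weights (the claim only requires weights that decrease with uncertainty) and the entropy definition from claim 3; all names are illustrative:

```python
import math

def entropy(probs):
    """Interpretation uncertainty: -sum(p * log p) over the distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def fuse(conventional_probs, reshoot_probs, eps=1e-6):
    """Fuse two class probability distributions with weights that decrease
    as interpretation uncertainty increases (here: inverse entropy),
    then normalize the weighted sum back to a probability distribution."""
    w1 = 1.0 / (entropy(conventional_probs) + eps)
    w2 = 1.0 / (entropy(reshoot_probs) + eps)
    fused = [w1 * p + w2 * q for p, q in zip(conventional_probs, reshoot_probs)]
    total = sum(fused)
    return [v / total for v in fused]

def should_terminate(fused_probs, uncertainty_floor,
                     reshoot_count, max_reshoots,
                     resource_spent, resource_budget):
    """Stop when the updated uncertainty drops to the lower-limit threshold,
    the maximum re-shooting count is reached, or the budget is exhausted."""
    return (entropy(fused_probs) <= uncertainty_floor
            or reshoot_count >= max_reshoots
            or resource_spent >= resource_budget)
```

When `should_terminate` returns false, the loop returns to re-shooting parameter packet generation, which is what makes the flow a closed loop rather than a single acquisition pass.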
- 8. An intelligent interpretation system for an unmanned aerial vehicle inspection image, which adopts the intelligent interpretation method for an unmanned aerial vehicle inspection image according to any one of claims 1-7, characterized by comprising an acquisition and interpretation uncertainty module, an uncertainty-driven re-shooting module and a re-shooting fusion closed-loop output module; the acquisition and interpretation uncertainty module is used for collecting unmanned aerial vehicle inspection images and synchronized metadata, obtaining an interpretation probability distribution and calculating an interpretation uncertainty; the uncertainty-driven re-shooting module is used for updating the sampling interval based on the interpretation uncertainty and generating a re-shooting control parameter packet based on re-shooting benefit and resource constraints; and the re-shooting fusion closed-loop output module is used for executing re-shooting and re-interpretation based on the re-shooting control parameter packet, fusing the conventional interpretation result with the re-shooting interpretation result, and performing closed-loop termination judgment and output.
Description
Intelligent interpretation method and system for unmanned aerial vehicle inspection image

Technical Field

The invention relates to the technical fields of intelligent interpretation of unmanned aerial vehicle inspection images and airborne data processing, in particular to an intelligent interpretation method and system for unmanned aerial vehicle inspection images.

Background

In recent years, unmanned aerial vehicles equipped with visible-light, infrared and other sensors have been used for routine inspection of power lines, transportation infrastructure and industrial equipment. With the development of gimbal stabilization, RTK positioning and edge computing, image acquisition has shifted from manual shooting to standardized route acquisition, image interpretation has evolved from thresholding and feature engineering to deep learning detection, segmentation and multi-task recognition, and workflows of onboard preprocessing, link return, cloud analysis and report generation have taken shape. Robust interpretation under complex illumination, scale change and occlusion, online quality control and edge-cloud collaboration have become important directions for improving automatic interpretation capability. Existing unmanned aerial vehicle inspection interpretation methods complete data acquisition with fixed routes, fixed sampling intervals and fixed shooting parameters; the interpretation stage generally only outputs defect categories and positions, or a simple confidence threshold judgment, and lacks quantitative modeling and closed-loop use of interpretation uncertainty. When small target scale, backlight, jitter or occlusion causes the model output distribution to diverge, the system can hardly trigger targeted re-shooting actions automatically; manual re-checking or post-processing is usually relied upon, and re-shooting parameters lack a unified decision basis.
In addition, existing schemes rarely bring constraints such as battery level, no-fly zones, wind speed and link quality explicitly into re-shooting selection, and under bandwidth fluctuation it is difficult to balance data return against local caching, so that under limited resources re-shooting is either blind or abandoned, and task completion and risk control are hard to achieve together. Furthermore, the sampling interval is typically not adapted to interpretation quality, so critical areas may be undersampled. Conventional interpretation results and re-shooting interpretation results are mostly simply overwritten or manually selected, and an uncertainty-based weighted fusion and termination judgment mechanism is absent, so a repeatable closed-loop flow of interpretation, decision-making, re-shooting, re-interpretation and fusion cannot be formed, and low-omission judgment and traceable output are therefore difficult to achieve stably.

Disclosure of Invention

The present invention has been made in view of the above problems. The technical problems solved by the invention are therefore that existing unmanned aerial vehicle inspection image interpretation methods use fixed sampling intervals and shooting parameters, do not form re-shooting triggering and re-shooting action selection based on interpretation uncertainty, and do not incorporate resource constraints such as battery level, no-fly zones, wind speed and link quality into re-shooting decision-making and closed-loop control; and, further, how to generate re-shooting control parameter packets and how to fuse conventional interpretation results with re-shooting interpretation results to complete closed-loop termination judgment and output.
To solve the above technical problems, the invention provides the following technical scheme. The intelligent interpretation method for an unmanned aerial vehicle inspection image comprises: collecting unmanned aerial vehicle inspection images and synchronized metadata, interpreting them to obtain an interpretation probability distribution and calculating an interpretation uncertainty; updating the sampling interval based on the interpretation uncertainty, and generating a re-shooting control parameter packet based on re-shooting benefit and resource constraints; and executing re-shooting and re-interpretation based on the re-shooting control parameter packet, fusing the conventional interpretation result with the re-shooting interpretation result, and performing closed-loop termination judgment and output. The unmanned aerial vehicle inspection image and synchronized metadata acquisition comprises acquiring an inspection image at the current moment with the unmanned aerial vehicle camera, acquiring unmanned aerial vehicle synchronized metadata with the same timestamp, and writing the inspection image and the synchronized metadata into a task ca