CN-121384036-B - Autonomous navigation method for inspection of distribution network unmanned aerial vehicle based on GPS information and visual information
Abstract
The invention relates to the technical field of autonomous routing inspection navigation of a distribution network unmanned aerial vehicle, in particular to a distribution network unmanned aerial vehicle routing inspection autonomous navigation method based on GPS information and visual information. According to the invention, an environment model is constructed by fusing GPS information, the earth curvature error is eliminated by adopting Gaussian projection, the signal noise is restrained by moving average filtering, the Thiessen polygon is utilized to quantitatively inspect points and obstacle distribution, an optimal route is generated by combining a tornado optimization algorithm, the shortest route without repetition and obstacle avoidance is realized, the obstacle is identified based on YOLOv model, an obstacle avoidance strategy is formulated according to a risk value, the route is dynamically optimized, an optical environment fingerprint library is constructed, the LSTM is combined to predict illumination change, the aperture shutter sensitivity and white balance are adaptively adjusted, the image brightness and definition are optimized, and the inspection efficiency and the data reliability are improved.
Inventors
- HU BING
- WANG SONG
- LIU WEIYAN
- HUANG YUHUI
- QIU ZHENYU
- WANG QINGWEN
- XIAO ZIYANG
Assignees
- 国网江西省电力有限公司信息通信分公司
- 江西科晨洪兴信息技术有限公司
Dates
- Publication Date
- 20260512
- Application Date
- 20251219
Claims (10)
- 1. The autonomous navigation method for the inspection of the distribution network unmanned aerial vehicle based on the GPS information and the visual information is characterized by comprising the following steps: S1, acquiring GPS coordinates of a patrol task point and surrounding obstacles, calling an airborne GPS to acquire coordinates of a flying spot, adopting a sliding average filtering algorithm to smooth a coordinate sequence for denoising, and utilizing Gaussian projection to convert longitude and latitude into plane rectangular coordinates to generate planarization coordinate data; S2, constructing a Delaunay triangle network based on the planarization coordinate data to determine the topological relation of adjacent points, forming Thiessen polygon dividing regions according to the radius of a circumcircle, numbering each region, randomly scrambling indexes through a Fisher-Yates shuffling algorithm, and generating a patrol region index sequence; S3, calculating the sum of Euclidean distances among task points as the route length according to the inspection area index sequence, taking the reciprocal of the sum as an fitness value, sorting the population, dividing the population into a tornado layer, a thunderstorm layer and a storm layer, weighting and updating positions by difference vectors by storm layer individuals, applying random disturbance, correcting the thunderstorm layer according to the tornado layer, realizing continuous solution to discrete index conversion through range mapping and validity restoration, and carrying out iterative optimization to obtain an optimal route index sequence; S4, controlling the unmanned aerial vehicle to fly sequentially according to the optimal route index sequence, collecting forward images in real time to identify the type, the size and the distance of the obstacle, calculating an environmental risk value through a risk assessment formula by combining the identification confidence coefficient, the pixel density and the safety threshold, triggering obstacle avoidance when the risk value exceeds the threshold, and returning to a route planning step to regenerate the optimal route; And S5, after the unmanned aerial vehicle reaches a task point, collecting images, extracting light environment information, constructing a light environment fingerprint library, generating a fingerprint sequence, matching the most similar historical fingerprints in the dynamic fingerprint library through cosine similarity, extracting corresponding time sequence data, forming a historical illumination sequence, predicting future illumination intensity, direction and spectrum change based on the sequence by utilizing LSTM, and adjusting aperture, exposure and white balance according to a prediction result to generate optimized light environment parameters.
- 2. The autonomous navigation method for inspection of the network unmanned aerial vehicle based on GPS information and visual information according to claim 1, wherein the specific calculation formula of the radius of the circumcircle is as follows: , Wherein, the Is the center of a circle, r is the radius, 、 、 Is the coordinates of any three points p i 、p j 、p k on the circumscribing circle of the Delaunay triangle network.
- 3. The autonomous navigation method for polling the network unmanned aerial vehicle based on GPS information and visual information according to claim 1, wherein the specific calculation formula for denoising the smooth coordinate sequence by the moving average filtering algorithm is as follows: , Wherein, the For the post-filter coordinates, In order to achieve a sliding window size, Taking the first GPS coordinate sequence as the original GPS coordinate sequence The original coordinates are the end point and the succession The original coordinates are the slave To the point of As the filtered coordinates 。
- 4. The autonomous navigation method for polling the network unmanned aerial vehicle based on GPS information and visual information according to claim 1, wherein the specific calculation formula of the route length is as follows: , Wherein, the Is an adjacent Thiessen polygon And (3) with The euclidean distance between the two, Is a penalty coefficient which is a function of the penalty coefficient, Is the route and the first A penalty value when an obstacle collides, Represents the first And a plurality of obstacles.
- 5. The autonomous navigation method for inspecting a network unmanned aerial vehicle based on GPS information and visual information according to claim 1, wherein the storm layer individual updates the position by means of difference vector weighting, and a specific position calculation formula is as follows: , Wherein, the Is the first After the iteration number of The location of the storm(s) is (are) located, Is the first At the time of iteration, the first The location of the storm(s) is (are) located, Is a random coefficient of the degree of freedom, Is an adjustment factor, for controlling the update amplitude, Is the location index of the random storm, 、 Representing the lower and upper bounds, respectively, of the search space for defining the range of random disturbances.
- 6. The autonomous navigation method for inspecting a network unmanned aerial vehicle based on GPS information and visual information according to claim 1, wherein the thunderstorm layer is corrected according to a tornado layer, and a specific position calculation formula is as follows: , Wherein, the Is the representation of the first At the time of iteration, the first The location of the individual thunderstorm(s), Is the representation of the first In the second iteration The location of the individual thunderstorm(s), And Is a random index, ensures the diversity of searches, Is a random coefficient for adjusting the step size of the movement to the tornado, Is the representation of the first The first iteration The location of the tornado is determined, Is the representation of the first The first iteration The location of the thunderstorm.
- 7. The autonomous navigation method for inspection of the unmanned aerial vehicle of the distribution network based on the GPS information and the visual information according to claim 1, wherein the environmental risk value comprises an identification result and an algorithm environmental risk, and a specific calculation formula is as follows: , Wherein, the Is obstacle recognition confidence; is the obstacle recognition frame size; , , respectively weight coefficients; Is the obstacle threat level.
- 8. The autonomous navigation method for inspection of a network-equipped unmanned aerial vehicle based on GPS information and visual information according to claim 7, wherein the environmental risk value further comprises different risk avoidance steps: (1) When (when) When the large-size obstacle or bird is identified, the unmanned aerial vehicle of the distribution network needs to be vertically pulled up to avoid danger, and the mechanical response time of the unmanned aerial vehicle of the distribution network is controlled At maximum climbing speed The specific movement formula of the vertical ascent is as follows: , Wherein, the , The unmanned aerial vehicle height is matched with the network after risk avoidance and before risk avoidance; Is the vertical pulling speed of the unmanned aerial vehicle of the distribution network, Is the duration of the risk avoidance maneuver; (2) When (when) When a tree or an interweaved cable is identified, the unmanned aerial vehicle with the distribution network needs to realize detour and slow lifting danger avoidance according to the detour radius, and a specific position movement formula is as follows: , Wherein, the Is the distance between the network unmanned plane and the obstacle; The maximum turning radius of the network unmanned aerial vehicle is set; The radius that the unmanned aerial vehicle needs to bypass when avoiding the obstacle is shown; (3) When (when) And when the obstacle with small size is identified, the unmanned aerial vehicle with the distribution network does not need emergency danger avoidance.
- 9. The autonomous navigation method for inspection of a distribution network unmanned aerial vehicle based on GPS information and visual information according to claim 1, wherein the light environment fingerprint library comprises environment illumination intensity, illumination direction angle and spectrum distribution duty ratio The dynamic fingerprint of each light environment scene is defined as a multidimensional feature vector, and the specific formula is as follows: , Wherein, the The light intensity after normalization is used for obtaining the light intensity, Is normalized the angle of the direction of illumination of the light, The ratio of each spectrum field is that the unmanned aerial vehicle of the distribution network collects new light environment fingerprint Later, the existing light environment fingerprint of the dynamic light environment library is needed The cosine similarity is matched and compared, and the concrete calculation formula of the cosine similarity is as follows: , If cosine similarity Judging that the fingerprint is new, adding the fingerprint into the fingerprint library, if the fingerprint is cosine similarity Similar fingerprints exist in the light environment fingerprint library, the light environment fingerprint is discarded, Is a cosine similarity threshold.
- 10. The autonomous navigation method for inspecting the unmanned aerial vehicle of the distribution network based on the GPS information and the visual information according to claim 1, wherein the specific steps of adjusting aperture, exposure and white balance according to the prediction result are as follows: (1) Extracting future illumination intensity from predicted fingerprints The change rate of the gradient of the light intensity in the future is calculated, and the specific calculation formula is as follows: , Wherein, the For the current moment Is used for the normalization of the illumination intensity, Representing the time interval, also called the time step, between two adjacent moments, i.e. the time difference between the two moments before and after, Predicting what time is in the future; the change rate of the light intensity gradient, positive gradient representing the rising of the light intensity, easy overexposure, negative gradient representing the falling of the light intensity, easy weak light, and the adjustment of the aperture photographed by the unmanned aerial vehicle of the distribution network through the future change rate of the light intensity gradient Shutter speed 、 The specific calculation formula of the sensitivity is as follows: , Wherein, the , , , , , The aperture and shutter speed after and before adjustment respectively, A parameter value of the sensitivity; , , Respectively aperture adjustment coefficients, shutter speed adjustment coefficients, A sensitivity adjustment coefficient; (2) Extracting future illumination direction angle from predicted fingerprint The specific calculation formula is as follows: , Wherein, the For the current moment When the normalized illumination direction angle of (2) In the time-course of which the first and second contact surfaces, For the direction angle threshold value, determining the rapid change of the illumination direction, and carrying out additional compensation exposure, wherein the specific adjustment exposure mode is as follows: , Wherein, the Is the adjusted exposure value; is the current exposure value; for compensation coefficient, adding forward compensation when the direction angle changes forward; (3) Extracting future spectral duty cycle from predicted fingerprint The white balance parameter is adjusted according to the spectrum component change, and the specific adjustment calculation mode is as follows: , Wherein, the For the current moment Lower (th) The ratio of the spectral components of the species, Is the white balance value before adjustment; is the white balance value after adjustment; The white balance coefficient corresponding to each spectrum is that the color temperature is reduced when the red light duty ratio is increased, and the color temperature is increased when the blue light duty ratio is increased.
Description
Autonomous navigation method for inspection of distribution network unmanned aerial vehicle based on GPS information and visual information Technical Field The invention relates to the technical field of autonomous patrol navigation of unmanned aerial vehicles of distribution networks, in particular to a method for autonomous patrol navigation of unmanned aerial vehicles of distribution networks based on GPS information and visual information. Background The network distribution unmanned aerial vehicle inspection is to carry visual acquisition equipment by using the unmanned aerial vehicle, and in the complex environment where the network distribution line and related equipment are located, the operation mode of inspecting inspection is carried out according to a preset task or an autonomous planning route, and the core is that the network distribution inspection work is completed by replacing or assisting the manual work through the flexible movement and data acquisition capability of the unmanned aerial vehicle. The development of inspection research of unmanned aerial vehicle of distribution network has important practical significance, and the distribution network is used as a key link for connecting a power system with users, and has wide application scene and is often in complicated scenes such as mountain areas, forests, urban building groups and the like. The traditional network distribution inspection mode is mainly based on manual inspection, workers need to carry detection tools to go to the site of network distribution equipment, equipment states are recorded through modes such as visual inspection and instrument measurement, for scenes such as high towers and mountain-crossing lines, complicated terrains are needed to be traversed by means of ascending equipment or walking, inspection period is long, labor intensity is high, the influence of weather and terrains is remarkable, data acquisition integrity and timeliness are difficult to ensure, even if part of scenes introduce unmanned aerial vehicle inspection, a simple preset route flight mode is adopted, route optimization is carried out without combining with environmental information, visual parameter adjustment depends on manual presetting rather than dynamic self-adaption, the recognition of obstacles is carried out by adopting a traditional image recognition algorithm, recognition accuracy and risk avoidance response speed are insufficient, and the whole inspection effect and safety are required to be improved. However, when the current network unmanned plane executes the inspection task in a real and complex environment, a plurality of problems to be solved are still faced. In the aspect of obstacle avoidance, the unmanned aerial vehicle of the distribution network has weak sensing, identifying and avoiding response capability to surrounding obstacles, is difficult to judge the position and movement trend of the obstacles rapidly and accurately, is easy to generate collision risk, can damage equipment, can interrupt the inspection task and influence work propulsion. In the aspect of visual adaptation, the network unmanned aerial vehicle lacks the ability of shooting system parameter according to ambient brightness dynamic adjustment, when facing different luminance scenes such as highlight, dim light, backlight, can't self-adaptation match optimum shooting luminance parameter for the inspection image of shooting easily appears overexposure, dim light problem, influences the quality of inspection task point image. Disclosure of Invention The invention aims to solve the defects in the prior art, and provides a network unmanned aerial vehicle inspection autonomous navigation method based on GPS information and visual information. In order to achieve the purpose, the invention adopts the following technical scheme that the network unmanned aerial vehicle inspection autonomous navigation method based on GPS information and visual information comprises the following steps: S1, acquiring GPS coordinates of a patrol task point and surrounding obstacles, calling an airborne GPS to acquire coordinates of a flying spot, adopting a sliding average filtering algorithm to smooth a coordinate sequence for denoising, and utilizing Gaussian projection to convert longitude and latitude into plane rectangular coordinates to generate planarization coordinate data; S2, constructing a Delaunay triangle network based on the planarization coordinate data to determine the topological relation of adjacent points, forming Thiessen polygon dividing regions according to the radius of a circumcircle, numbering each region, randomly scrambling indexes through a Fisher-Yates shuffling algorithm, and generating a patrol region index sequence; S3, calculating the sum of Euclidean distances among task points as the route length according to the inspection area index sequence, taking the reciprocal of the sum as an fitness value, sorting the population, dividing the popula