CN-121995345-A - Unmanned aerial vehicle course angle accurate measurement method based on near infrared laser and reflective pattern
Abstract
The invention discloses a method for accurately measuring the heading angle of an unmanned aerial vehicle based on near-infrared laser and a reflective pattern. A ground near-infrared imaging system is paired with an onboard geometric reflective pattern having unique directivity, and measurement is realized through a Vision Transformer deep learning model. A large number of unlabeled images are exploited: random rotations are applied manually and the model predicts the rotation angle, so that the model learns the directional characteristics of the reflective pattern at zero labeling cost; high-precision measurement is then achieved by fine-tuning on a small amount of supervised data. The method solves the problems of difficult heading measurement and high labeled-data acquisition cost for unmanned aerial vehicles in night and low-light environments, and has advantages such as strong anti-interference capability and simple system deployment.
Inventors
- YANG TAO
- YANG JIAXIN
- BAO HONGWEI
- GENG NING
- LI DONGDONG
- LI JING
- ZHANG YISU
- WANG FEIYA
Assignees
- Northwestern Polytechnical University (西北工业大学)
Dates
- Publication Date
- 20260508
- Application Date
- 20260313
Claims (10)
- 1. A method for accurately measuring the heading angle of an unmanned aerial vehicle based on near-infrared laser and a reflective pattern, characterized by comprising the following steps: step 1, constructing a ground near-infrared imaging and illumination system; step 2, designing and arranging an onboard reflective coding pattern; step 3, data acquisition and self-supervised pre-training dataset construction; step 4, constructing and training a Vision Transformer-based angle detection model; and step 5, accurately measuring the heading angle online.
- 2. The method for accurately measuring the heading angle of the unmanned aerial vehicle based on the near-infrared laser and the reflective pattern according to claim 1, wherein step 1 specifically comprises: the ground equipment comprises a near-infrared laser lamp, a high-resolution long-focus lens and an industrial camera; a narrow-band filter is mounted in front of the long-focus lens, its central wavelength matching the wavelength of the near-infrared laser lamp, so as to filter out background noise in the natural-light and visible-light bands and ensure that only laser reflection signals are received.
- 3. The method for accurately measuring the heading angle of the unmanned aerial vehicle based on the near-infrared laser and the reflective pattern according to claim 2, wherein step 2 specifically comprises: omnidirectional reflective strips are affixed to form a coding pattern with unique characteristics, so that when the unmanned aerial vehicle is observed from the ground viewpoint, the image features presented by the strips at different rotation angles are unique, and the current absolute heading angle of the unmanned aerial vehicle can be solved inversely from the image.
- 4. The method for accurately measuring the heading angle of the unmanned aerial vehicle based on the near-infrared laser and the reflective pattern according to claim 3, wherein step 3 specifically comprises: collecting images of the unmanned aerial vehicle's reflective strips with the constructed ground near-infrared imaging and illumination system, and adopting a rotation-based self-supervised learning strategy: taking any image as a reference, applying a random rotation with zero filling to generate a training sample, and using the applied rotation angle as a pseudo-label, so that the model learns the geometric-topological characteristics of the coding pattern as it changes with angle.
- 5. The method for accurately measuring the heading angle of the unmanned aerial vehicle based on the near-infrared laser and the reflective pattern according to claim 4, wherein step 4 specifically comprises: constructing a Vision Transformer-based deep neural network; a) model architecture: the collected coding pattern image is segmented into fixed-size patches, mapped into a vector sequence through linear projection, and position encodings are added; b) the training process is divided into two stages: the first stage performs self-supervised pre-training on the data collected in step 3 to optimize the encoder parameters, and the second stage loads the pre-trained weights and performs supervised fine-tuning on a dataset with ground-truth angles, the output layer being replaced by a regression head that directly predicts the heading angle value θ.
- 6. The method for accurately measuring the heading angle of the unmanned aerial vehicle based on the near-infrared laser and the reflective pattern according to claim 5, wherein step 5 specifically comprises: irradiating the unmanned aerial vehicle with the ground near-infrared laser lamp; the long-focus camera captures a high-contrast reflective-strip image through the filter; the image is input into the trained Vision Transformer-based angle detection model, which outputs the predicted heading angle in real time; and the angle information is sent over a wireless data link to the unmanned aerial vehicle's flight control system for closed-loop control, or to a ground station for display.
- 7. The method for accurately measuring the heading angle of the unmanned aerial vehicle based on the near-infrared laser and the reflective pattern according to claim 6, characterized in that the pattern has no N-fold rotational symmetry axis in the two-dimensional plane for any N ≥ 2, i.e., as the pattern is rotated about its geometric center through 360 degrees it coincides with the original image only at 0 degrees; from the viewpoint of the mathematical definition, the geometric centroid of the pattern is deliberately made not to coincide with the center of its minimum circumscribed circle, the Euclidean distance D between the two points being greater than a set proportion of the minimum circumscribed circle's radius; the vector from the circumscribed-circle center to the geometric centroid forms the pattern's unique polar-axis direction, physically indicating the absolute heading of the unmanned aerial vehicle.
- 8. An electronic device comprising a processor and a memory, the memory for storing a computer program, the processor for executing the computer program stored in the memory to cause the electronic device to perform the method of any one of claims 1 to 7.
- 9. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method according to any one of claims 1 to 7.
- 10. A chip comprising a processor for calling and running a computer program from a memory, causing a device on which the chip is mounted to perform the method of any one of claims 1 to 7.
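The geometric uniqueness condition of claim 7 (no N-fold rotational symmetry, and a centroid offset from the minimum-circumcircle center so a polar axis exists) can be illustrated with a short pure-Python check. This is not part of the patent; the helper names are hypothetical, and the minimum-enclosing-circle search is a naive brute force that only works for small marker layouts:

```python
import math
from itertools import combinations

def centroid(pts):
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)

def _circle_two(a, b):
    # circle whose diameter is the segment a-b
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2, math.dist(a, b) / 2)

def _circle_three(a, b, c):
    # circumscribed circle of a triangle; None for (near-)collinear points
    d = 2 * (a[0] * (b[1] - c[1]) + b[0] * (c[1] - a[1]) + c[0] * (a[1] - b[1]))
    if abs(d) < 1e-12:
        return None
    ux = ((a[0]**2 + a[1]**2) * (b[1] - c[1]) + (b[0]**2 + b[1]**2) * (c[1] - a[1])
          + (c[0]**2 + c[1]**2) * (a[1] - b[1])) / d
    uy = ((a[0]**2 + a[1]**2) * (c[0] - b[0]) + (b[0]**2 + b[1]**2) * (a[0] - c[0])
          + (c[0]**2 + c[1]**2) * (b[0] - a[0])) / d
    return (ux, uy, math.dist((ux, uy), a))

def min_enclosing_circle(pts):
    # brute force: the true minimum circle is defined by 2 or 3 of the points
    cands = [_circle_two(a, b) for a, b in combinations(pts, 2)]
    cands += [c for t in combinations(pts, 3) if (c := _circle_three(*t))]
    best = None
    for cx, cy, r in cands:
        if all(math.dist((cx, cy), p) <= r + 1e-9 for p in pts):
            if best is None or r < best[2]:
                best = (cx, cy, r)
    return best

def has_rotational_symmetry(pts, center, n_max=8, tol=1e-6):
    # True if the point set maps onto itself under rotation by 360/N, some N >= 2
    for n in range(2, n_max + 1):
        ang = 2 * math.pi / n
        c, s = math.cos(ang), math.sin(ang)
        rot = [(center[0] + (x - center[0]) * c - (y - center[1]) * s,
                center[1] + (x - center[0]) * s + (y - center[1]) * c)
               for x, y in pts]
        if all(min(math.dist(p, q) for q in pts) < tol for p in rot):
            return True
    return False

def is_valid_marker(pts, ratio=0.1):
    cx, cy, r = min_enclosing_circle(pts)
    if has_rotational_symmetry(pts, (cx, cy)):
        return False                               # ambiguous under some rotation
    # polar axis (circle center -> centroid) must be well defined: D > ratio * r
    return math.dist((cx, cy), centroid(pts)) > ratio * r

l_shape = [(0, 0), (0, 1), (0, 2), (1, 0), (2, 0)]   # asymmetric strip layout
square = [(0, 0), (0, 2), (2, 0), (2, 2)]            # 4-fold symmetric: rejected
print(is_valid_marker(l_shape), is_valid_marker(square))  # → True False
```

A symmetric layout such as the square is rejected because several rotation angles would produce identical images, making the heading ambiguous, which is exactly what claim 7 rules out.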
Description
Unmanned aerial vehicle heading angle accurate measurement method based on near-infrared laser and reflective pattern
Technical Field
The invention belongs to the technical field of unmanned aerial vehicles, and particularly relates to a method for accurately measuring the heading angle of an unmanned aerial vehicle based on near-infrared laser and a reflective pattern.
Background
Accurate acquisition of the heading angle of an unmanned aerial vehicle is the core of realizing precise navigation, fixed-point hovering and complex maneuvering flight. Common heading angle acquisition schemes include: 1) a magnetic compass, the cheapest and most common solution, but magnetic compasses are extremely susceptible to surrounding ferromagnetic material and electromagnetic interference, leading to drifting or even failed readings; 2) GPS/RTK dual-antenna direction finding, which is accurate but increases the weight and cost of the onboard equipment and cannot work in environments where satellite signals are blocked; 3) vision-based SLAM or VIO, which relies on environmental texture features, easily loses tracking at night, in dim light, or in scenes with repeated texture, and consumes substantial onboard computing resources.
In the prior art, there is a lack of a remote high-precision heading measurement means that does not depend on external satellite signals and does not occupy onboard computing resources under night or dim-light conditions.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a method for accurately measuring the heading angle of an unmanned aerial vehicle based on near-infrared laser and a reflective pattern. A ground near-infrared imaging system is paired with an onboard geometric reflective pattern having unique directivity, and measurement is realized through a Vision Transformer deep learning model. A large number of unlabeled images are exploited: random rotations are applied manually and the model predicts the rotation angle, so that the model learns the directional characteristics of the reflective pattern at zero labeling cost; high-precision measurement is then achieved by fine-tuning on a small amount of supervised data. The method solves the problems of difficult heading measurement and high labeled-data acquisition cost for unmanned aerial vehicles in night and low-light environments, and has advantages such as strong anti-interference capability and simple system deployment. The technical scheme adopted to solve the technical problem is as follows: step 1, constructing a ground near-infrared imaging and illumination system; step 2, designing and arranging an onboard reflective coding pattern; step 3, data acquisition and self-supervised pre-training dataset construction; step 4, constructing and training a Vision Transformer-based angle detection model; and step 5, accurately measuring the heading angle online.
Preferably, step 1 specifically comprises: the ground equipment comprises a near-infrared laser lamp, a high-resolution long-focus lens and an industrial camera; a narrow-band filter is mounted in front of the long-focus lens, its central wavelength matching the wavelength of the near-infrared laser lamp, so as to filter out background noise in the natural-light and visible-light bands and ensure that only laser reflection signals are received. Preferably, step 2 specifically comprises: omnidirectional reflective strips are affixed to form a coding pattern with unique characteristics, so that when the unmanned aerial vehicle is observed from the ground viewpoint, the image features presented by the strips at different rotation angles are unique, and the current absolute heading angle of the unmanned aerial vehicle can be solved inversely from the image. Preferably, step 3 specifically comprises: collecting images of the unmanned aerial vehicle's reflective strips with the constructed ground near-infrared imaging and illumination system, and adopting a rotation-based self-supervised learning strategy: taking any image as a reference, applying a random rotation with zero filling to generate a training sample, and using the applied rotation angle as a pseudo-label, so that the model learns the geometric-topological characteristics of the coding pattern as it changes with angle.
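The rotation-with-zero-filling sample generation of step 3 can be sketched in pure Python. This is a minimal nearest-neighbour illustration under assumed names (in practice an image library such as OpenCV or torchvision would perform the rotation), not the patented implementation:

```python
import math
import random

def rotate_zero_fill(img, angle_deg):
    """Rotate a 2-D grid about its centre (nearest-neighbour sampling),
    zero-filling output pixels that fall outside the source image."""
    h, w = len(img), len(img[0])
    cy, cx = (h - 1) / 2, (w - 1) / 2
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # inverse-map each output pixel back into the source image
            sx = cos_a * (x - cx) + sin_a * (y - cy) + cx
            sy = -sin_a * (x - cx) + cos_a * (y - cy) + cy
            ix, iy = round(sx), round(sy)
            if 0 <= ix < w and 0 <= iy < h:
                out[y][x] = img[iy][ix]
    return out

def make_pretraining_pair(img, rng=random):
    """One self-supervised sample: (rotated image, rotation-angle pseudo-label).
    No manual annotation is needed; the applied angle IS the label."""
    angle = rng.uniform(0.0, 360.0)
    return rotate_zero_fill(img, angle), angle

base = [[1, 0, 0],
        [0, 0, 0],
        [0, 0, 0]]
sample, pseudo_label = make_pretraining_pair(base)
```

Because the pseudo-label is generated by the augmentation itself, arbitrarily many (image, angle) pairs can be produced from a single captured reflective-strip image, which is what makes the pre-training stage zero-cost in labeling.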
Preferably, step 4 specifically comprises: constructing a Vision Transformer-based deep neural network; a) model architecture: the collected coding pattern image is segmented into fixed-size patches, mapped into a vector sequence through linear projection, and position encodings are added; b) the training process is divided into two stages: the first stage uses the data collected in step 3 for self-supervised pre-training to optimize the enco