CN-121977497-A - Unmanned aerial vehicle attitude angle measurement method based on horizon detection
Abstract
The invention provides an unmanned aerial vehicle attitude angle measurement method based on horizon detection, which comprises the steps of: defining horizon key points; constructing an end-to-end horizon key point detection neural network, wherein the network comprises a backbone network, a transposed convolution network and a self-attention module; predicting horizon key point coordinates from the end-to-end horizon key point coordinate feature map; acquiring the horizon from the predicted key point coordinates; and acquiring the roll angle and pitch angle of the unmanned aerial vehicle from the imaging relation of the horizon in the camera. The invention solves the technical problem in the prior art that, in air flight scenes, it is difficult for a micro inertial navigation system to accurately acquire its attitude, resulting in low in-air alignment precision.
Inventors
- JIAO HAO
- ZHU MIN
- SUN FUZE
- WANG YUANYUAN
- ZHANG JIAN
- XU CE
- HU GUANGFENG
Assignees
- Beijing Institute of Automatic Control Equipment (北京自动化控制设备研究所)
Dates
- Publication Date
- 20260505
- Application Date
- 20251231
Claims (7)
- 1. An unmanned aerial vehicle attitude angle measurement method based on horizon detection, characterized by comprising the following steps: defining horizon key points, namely taking the horizon starting point and the horizon end point as visual features and modeling horizon recognition as a key point detection problem; constructing an end-to-end horizon key point detection neural network, wherein the end-to-end horizon key point detection neural network comprises a backbone network, a transposed convolution network and a self-attention module; subjecting the input image to feature extraction by the backbone network to obtain 1024-channel high-dimensional abstract features at an output stride (OS) of 32, restoring the feature map to OS = 4 through the transposed convolution network, and performing a feature similarity transformation through the self-attention module to obtain an end-to-end horizon key point coordinate feature map, wherein the end-to-end horizon key point coordinate feature map comprises channels for predicting horizon key point coordinates; predicting the horizon key point coordinates according to the end-to-end horizon key point coordinate feature map; acquiring the horizon according to the predicted horizon key point coordinates; and acquiring the roll angle and the pitch angle of the unmanned aerial vehicle according to the imaging relation of the horizon in the camera.
- 2. The unmanned aerial vehicle attitude angle measurement method based on horizon detection according to claim 1, wherein the backbone network adopts a DenseNet-121 network with the pooling layer and fully connected layer that map the classification vector removed.
- 3. The unmanned aerial vehicle attitude angle measurement method based on horizon detection according to claim 1, wherein the transposed convolution network uses a three-stage transposed convolution to restore the OS = 32 feature map output by the backbone network to OS = 4, and uses a 3×3 depthwise separable convolution to perform the convolution operation on the input feature map channel by channel.
- 4. The unmanned aerial vehicle attitude angle measurement method based on horizon detection according to claim 3, wherein the self-attention module performs matrix multiplication on the feature map before the 3×3 depthwise separable convolution and the feature map after the 3×3 depthwise separable convolution to obtain a 256-channel attention heat map at OS = 4, thereby fusing the global similarity information of the horizon across the feature map.
- 5. The unmanned aerial vehicle attitude angle measurement method based on horizon detection according to claim 4, wherein the attention heat map is mapped to key point coordinates and their connection relations by a 1×1 convolution, obtaining a key point coordinate feature map F_c.
- 6. The unmanned aerial vehicle attitude angle measurement method based on horizon detection according to claim 1, wherein the roll angle γ of the unmanned aerial vehicle is obtained according to γ = arctan(l_y / l_x), wherein the projection of the horizon L on the imaging plane of the camera is l, the direction vector of the projection line l in the camera coordinate system is l^C, and l_x and l_y are the coordinates of l^C projected on the X-axis and Y-axis of the camera coordinate system, respectively.
- 7. The unmanned aerial vehicle attitude angle measurement method based on horizon detection according to claim 6, wherein the pitch angle α of the unmanned aerial vehicle is obtained according to α = arctan((u·sin γ + v·cos γ) / f), wherein u and v are the physical coordinates of a point p on the image plane, the point p is the projection of the midpoint of the horizon L onto the image plane, and f is the focal length of the camera.
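The similarity transform in claims 3–5 can be illustrated with a toy NumPy sketch. This is not the patented implementation: the dimensions (256 channels, a 32×32 grid at OS = 4), the exact matmul ordering, and the 4-channel 1×1 head are illustrative assumptions; real convolutions are replaced by plain matrix products.

```python
import numpy as np

# Toy dimensions (assumptions, not from the patent): 256-channel
# feature maps on a 32x32 grid (OS = 4 for a 128x128 input).
C, H, W = 256, 32, 32
rng = np.random.default_rng(0)

f_before = rng.standard_normal((C, H * W))  # features before the 3x3 depthwise conv
f_after = rng.standard_normal((C, H * W))   # features after the 3x3 depthwise conv

# Claims 3-4: a matrix product of the two feature maps yields a
# channel-similarity matrix, used to re-weight features globally.
attention = f_after @ f_before.T            # (C, C) channel similarity
heat = (attention @ f_before).reshape(C, H, W)  # 256-channel heat map at OS = 4

# Claim 5: a 1x1 convolution is a (C_out, C_in) matrix applied per pixel;
# 4 output channels is an assumed layout (e.g. x/y for two key points).
w_1x1 = rng.standard_normal((4, C))
F_c = (w_1x1 @ heat.reshape(C, -1)).reshape(4, H, W)
print(heat.shape, F_c.shape)
```

The key point is that the matmul compares every spatial location's feature against every other's, which is how the module injects the global similarity information the claims describe.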
Description
Unmanned aerial vehicle attitude angle measurement method based on horizon detection

Technical Field

The invention belongs to the technical field of attitude calculation, and particularly relates to an unmanned aerial vehicle attitude angle measurement method based on horizon detection.

Background

At present, driven by a new technological revolution, the form of warfare is rapidly evolving towards intelligent warfare. Intelligent equipment can effectively enhance battlefield transparency and reduce casualties, and has remarkable military value. However, in complex combat environments (buildings, tunnels and indoor spaces), satellite-denied conditions have become the norm and traditional satellite-based positioning technology is greatly restricted, so the need for high-precision autonomous navigation and positioning in complex environments is urgent. Inertial navigation is the most typical autonomous navigation technique; it is the only navigation technique that is autonomous, real-time, continuous and free of environmental limitations, but its disadvantage is that navigation errors diverge over time. In particular, in air flight scenes it is difficult for a micro inertial navigation system to acquire an accurate attitude, so alignment cannot be completed and the navigation information is seriously distorted.

Disclosure of Invention

The present invention aims to solve at least one of the technical problems existing in the prior art.
The invention provides an unmanned aerial vehicle attitude angle measurement method based on horizon detection, which comprises the following steps: defining horizon key points, namely taking the horizon starting point and the horizon end point as visual features and modeling horizon recognition as a key point detection problem; constructing an end-to-end horizon key point detection neural network, wherein the end-to-end horizon key point detection neural network comprises a backbone network, a transposed convolution network and a self-attention module; subjecting the input image to feature extraction by the backbone network to obtain 1024-channel high-dimensional abstract features at an output stride (OS) of 32, restoring the feature map to OS = 4 through the transposed convolution network, and performing a feature similarity transformation through the self-attention module to obtain an end-to-end horizon key point coordinate feature map, wherein the end-to-end horizon key point coordinate feature map comprises channels for predicting horizon key point coordinates; predicting the horizon key point coordinates according to the end-to-end horizon key point coordinate feature map; acquiring the horizon according to the predicted horizon key point coordinates; and acquiring the roll angle and the pitch angle of the unmanned aerial vehicle according to the imaging relation of the horizon in the camera. Further, the backbone network adopts a DenseNet-121 network with the pooling layer and fully connected layer that map the classification vector removed. Further, the transposed convolution network uses a three-stage transposed convolution to restore the OS = 32 feature map output by the backbone network to OS = 4, and performs the convolution operation on the input feature map channel by channel using a 3×3 depthwise separable convolution.
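The output-stride bookkeeping described above can be sketched shape-only in NumPy. This is a minimal sketch under stated assumptions: a 256×256 input is assumed, and the backbone and transposed-conv stages are replaced by subsample/repeat stand-ins that only reproduce the OS = 32 → OS = 4 shape arithmetic, not real DenseNet-121 or learned deconvolutions.

```python
import numpy as np

def backbone_stub(img):
    """Stand-in for the backbone: downsample by 32 (OS = 32), 1024 channels."""
    feat = img[::32, ::32]                       # stride-32 subsampling
    return np.broadcast_to(feat[..., None], feat.shape + (1024,)).copy()

def upsample2x(feat):
    """Stand-in for one transposed-convolution stage: 2x spatial upsampling."""
    return feat.repeat(2, axis=0).repeat(2, axis=1)

img = np.zeros((256, 256))                       # assumed input size
feat = backbone_stub(img)                        # (8, 8, 1024) at OS = 32
for _ in range(3):                               # three stages: 32 -> 16 -> 8 -> 4
    feat = upsample2x(feat)
print(feat.shape)                                # (64, 64, 1024) at OS = 4
```

Three ×2 stages give an overall ×8 upsampling, which is exactly what turns OS = 32 into OS = 4.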
Further, the self-attention module performs matrix multiplication on the feature map before the 3×3 depthwise separable convolution and the feature map after the 3×3 depthwise separable convolution to obtain a 256-channel attention heat map at OS = 4, thereby fusing the global similarity information of the horizon across the feature map. Further, the attention heat map is mapped to key point coordinates and their connection relations by a 1×1 convolution, obtaining a key point coordinate feature map F_c. Further, the roll angle γ of the unmanned aerial vehicle is obtained according to γ = arctan(l_y / l_x), wherein the projection of the horizon L on the imaging plane of the camera is l, the direction vector of the projection line l in the camera coordinate system is l^C, and l_x and l_y are the coordinates of l^C projected on the X-axis and Y-axis of the camera coordinate system, respectively. Further, the pitch angle α of the unmanned aerial vehicle is obtained according to α = arctan((u·sin γ + v·cos γ) / f), wherein u and v are the physical coordinates of a point p on the image plane, the point p is the projection of the midpoint of the horizon L onto the image plane, and f is the focal length of the camera. By applying the technical scheme of the invention, the invention provides an unmanned aerial vehicle attitude angle measurement method based on horizon detection, which predicts horizon key point coordinates through an end-to-end horizon key point detection neural network, further obtains the horizon, and obtains the roll angle and the pitch angle of the unmanned aerial vehicle according to the imaging relation of the horizon in the camera. The method provides
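A worked numeric example of the roll/pitch relations can be written with the standard library alone. Note that the original formula images did not survive extraction: the roll relation γ = arctan(l_y / l_x) follows directly from the definitions above, while the pitch form used here (projecting the horizon midpoint onto the roll-rotated image vertical before dividing by f) is an assumed reconstruction, with image y taken upward and all coordinates in the same physical units as f.

```python
import math

def roll_from_horizon(p1, p2):
    """gamma = arctan(l_y / l_x), with l = p2 - p1 the horizon direction vector."""
    lx, ly = p2[0] - p1[0], p2[1] - p1[1]
    return math.atan2(ly, lx)

def pitch_from_midpoint(p1, p2, f, gamma):
    """Assumed reconstruction: offset of the horizon midpoint p = (u, v)
    along the image 'vertical' (rotated by the roll gamma), over focal length."""
    u, v = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2
    return math.atan((u * math.sin(gamma) + v * math.cos(gamma)) / f)

# Assumed values: a level horizon 50 units above the principal point, f = 800.
p1, p2, f = (-200.0, 50.0), (200.0, 50.0), 800.0
gamma = roll_from_horizon(p1, p2)            # level horizon -> 0 rad
alpha = pitch_from_midpoint(p1, p2, f, gamma)  # atan(50 / 800)
print(gamma, alpha)
```

With a level horizon the roll vanishes and the pitch reduces to arctan(v / f), matching the intuition that the horizon's vertical offset in the image encodes the camera's pitch.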