CN-121994220-A - Landing zone high-precision inertial vision integrated navigation method
Abstract
The invention provides a landing zone high-precision inertial vision integrated navigation method comprising the following steps: constructing a landing zone inertial vision integrated navigation filtering state variable; calculating the projection of the relative distance between the inertial navigation position and the airport origin in the airport coordinate system to obtain the horizontal position information of visual navigation; constructing a landing zone inertial vision integrated navigation filtering observation equation; performing inertial vision integrated navigation filtering calculation based on the integrated navigation filtering state variable and the observation equation; and carrying out adaptive correction of the integrated navigation position error and velocity error based on the filter estimation effect. The invention exploits the continuous real-time operation of inertial navigation, the high-precision relative measurement of visual navigation, and the high-precision height measurement of laser ranging; it realizes high-precision, high-reliability three-dimensional navigation positioning within the limited time of the landing zone and improves the autonomous landing capability of unmanned aerial vehicles in complex environments.
Inventors
- WANG KANG
- LIU CHONGLIANG
- WEI YONGSHU
- MING LI
- ZHAO YUFEI
- LI ZHI
- ZHAO LIANG
- ZHANG WEIJIAN
- SHANG KEJUN
- HU GUANGFENG
- XU CE
Assignees
- 北京自动化控制设备研究所 (Beijing Automation Control Equipment Institute)
Dates
- Publication Date: 2026-05-08
- Application Date: 2025-12-29
Claims (7)
- 1. A landing zone high-precision inertial vision integrated navigation method, characterized by: constructing a landing zone inertial vision integrated navigation filtering state variable, wherein the filtering state variable comprises inertial navigation position errors, velocity errors, attitude errors and device errors; constructing a landing zone relative navigation model from inertial navigation information in the geographic coordinate system with the airport coordinate system as reference, calculating the projection of the relative distance between the inertial navigation position and the airport origin in the airport coordinate system, and obtaining the horizontal position information of visual navigation; taking the horizontal position information obtained by visual navigation and the relative height information obtained by laser ranging as observables, and constructing a landing zone inertial vision integrated navigation filtering observation equation; performing inertial vision integrated navigation filtering calculation based on the integrated navigation filtering state variable and the observation equation; and carrying out adaptive correction of the integrated navigation position error and velocity error based on the filter estimation effect.
- 2. The landing zone high-precision inertial vision integrated navigation method according to claim 1, wherein the integrated navigation filter state variable X_k is: X_k = [δL δh δλ δV_n δV_u δV_e φ_n φ_u φ_e ε_x ε_y ε_z]^T, wherein δL, δh and δλ are the latitude, altitude and longitude errors of inertial navigation respectively; δV_n, δV_u, δV_e are the north, up and east velocity errors of inertial navigation respectively; φ_n, φ_u, φ_e are the north, up and east misalignment angles of inertial navigation respectively; and ε_x, ε_y, ε_z are the gyro drifts in the inertial navigation body coordinate system.
- 3. The landing zone high-precision inertial vision integrated navigation method according to claim 2, wherein the projection P_ins of the relative distance between the inertial navigation position and the airport origin in the airport coordinate system is: P_ins = C_t^a · [(lat − lat_O)(R_m + hgt); hgt − hgt_O; (lon − lon_O)(R_n + hgt)cos(lat)], wherein C_t^a is the transformation matrix from the geographic frame to the airport frame, C_ij is the element in the i-th row and j-th column of that matrix, R_m and R_n are the radius of curvature of the meridian and the radius of curvature of the prime vertical respectively, [lat hgt lon] is the inertial navigation position, and [lat_O hgt_O lon_O] is the position of the airport origin.
- 4. The landing zone high-precision inertial vision integrated navigation method of claim 3, wherein the observed quantity Z_k is: Z_k = [P_ins[1] − X_V; P_ins[2] − H_L; P_ins[3] − Z_V], wherein X_V and Z_V are the forward and lateral positions in the airport coordinate system obtained by visual navigation, H_L is the height above ground obtained by laser ranging, and P_ins[1], P_ins[2], P_ins[3] are the forward, vertical and lateral positions in the airport coordinate system obtained by inertial navigation. The observation matrix H_k is: H_k = [H_p 0_{3×6} 0_{3×6}].
- 5. The landing zone high-precision inertial vision integrated navigation method of claim 4, wherein the integrated navigation filtering calculation is as follows: state prediction: X_{k,k-1} = Φ_{k,k-1} X_{k-1}; state prediction variance: P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Q_{k-1}; filter gain: K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}; state estimate: X_k = X_{k,k-1} + K_k (Z_k − H_k X_{k,k-1}); state estimation variance: P_k = (I − K_k H_k) P_{k,k-1} (I − K_k H_k)^T + K_k R_k K_k^T; wherein X_{k,k-1} is the one-step prediction of the filter state, X_{k-1} is the filter state at time k−1, Φ_{k,k-1} is the integrated navigation state transition matrix established from the inertial navigation error model, P_{k,k-1} is the one-step prediction of the error covariance, P_{k-1} is the error covariance at time k−1, Q_{k-1} is the system noise matrix at time k−1, K_k is the filter gain matrix at time k, R_k is the measurement noise matrix at time k, Z_k is the observation at time k, H_k is the observation matrix at time k, P_k is the error covariance at time k, I is the identity matrix, and the subscripts k and k−1 denote the current and previous moments respectively.
- 6. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the landing zone high-precision inertial vision integrated navigation method of any one of claims 1 to 5 when executing the program.
- 7. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the landing zone high-precision inertial vision integrated navigation method of any one of claims 1 to 5.
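The filtering calculation in claim 5 is a standard Kalman predict/update cycle with the Joseph-form covariance update. The following is a minimal illustrative sketch, not part of the patent; all function and variable names are assumptions chosen to mirror the claim's notation.

```python
import numpy as np

def kf_step(x_prev, P_prev, Phi, Q, H, R, z):
    """One integrated-navigation filter cycle following claim 5's equations."""
    # State prediction: X_{k,k-1} = Phi_{k,k-1} X_{k-1}
    x_pred = Phi @ x_prev
    # State prediction variance: P_{k,k-1} = Phi P_{k-1} Phi^T + Q_{k-1}
    P_pred = Phi @ P_prev @ Phi.T + Q
    # Filter gain: K_k = P_{k,k-1} H^T (H P_{k,k-1} H^T + R)^{-1}
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    # State estimate: X_k = X_{k,k-1} + K (Z_k - H X_{k,k-1})
    x = x_pred + K @ (z - H @ x_pred)
    # Joseph-form estimation variance: P_k = (I-KH) P (I-KH)^T + K R K^T
    IKH = np.eye(len(x)) - K @ H
    P = IKH @ P_pred @ IKH.T + K @ R @ K.T
    return x, P

# One cycle on a trivial 1-state example (assumed values):
x, P = kf_step(np.array([0.0]), np.array([[1.0]]), np.eye(1),
               np.zeros((1, 1)), np.eye(1), np.array([[1.0]]),
               np.array([2.0]))
```

The Joseph form used in claim 5 keeps the covariance symmetric and positive semi-definite even with a suboptimal gain, which matters when the gain is later adjusted adaptively as in claim 1.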
Description
Landing zone high-precision inertial vision integrated navigation method

Technical Field

The invention belongs to the technical field of navigation, and particularly relates to a landing zone high-precision inertial vision integrated navigation method.

Background

High-precision, high-reliability navigation positioning is a key guarantee for the safe landing of unmanned aerial vehicles. At present, unmanned aerial vehicle landing navigation generally depends on airport ground equipment or an airborne differential satellite positioning system. These approaches place high demands on airport ground facilities and, under satellite-denied or degraded conditions such as complex electromagnetic interference, cannot provide navigation positioning information that meets the requirements of a safe landing, exposing obvious limitations and vulnerabilities. By imaging the runway with a visual sensor and recognizing and extracting features such as the runway boundary and threshold, the relative pose of the unmanned aerial vehicle with respect to the runway can be computed; fusing this with inertial navigation information then yields continuous, real-time autonomous navigation information, making vision an effective means of improving the autonomous landing navigation capability of unmanned aerial vehicles in denied environments. However, the ability of visual navigation to identify airport features is affected by environmental changes such as illumination and season, which challenges the precision and reliability of landing zone inertial vision integrated navigation.

Disclosure of Invention

The invention aims to overcome the above defects in the prior art and provides a landing zone high-precision inertial vision integrated navigation method that solves the problems described above.
The technical solution of the invention is as follows. According to a first aspect, a landing zone high-precision inertial vision integrated navigation method is provided, comprising the following steps: constructing a landing zone inertial vision integrated navigation filtering state variable, wherein the filtering state variable comprises inertial navigation position errors, velocity errors, attitude errors and device errors; constructing a landing zone relative navigation model from inertial navigation information in the geographic coordinate system with the airport coordinate system as reference, calculating the projection of the relative distance between the inertial navigation position and the airport origin in the airport coordinate system, and obtaining the horizontal position information of visual navigation; taking the horizontal position information obtained by visual navigation and the relative height information obtained by laser ranging as observables, and constructing a landing zone inertial vision integrated navigation filtering observation equation; performing inertial vision integrated navigation filtering calculation based on the integrated navigation filtering state variable and the observation equation; and carrying out adaptive correction of the integrated navigation position error and velocity error based on the filter estimation effect. Further, the landing zone inertial vision integrated navigation filtering state variable X_k is: X_k = [δL δh δλ δV_n δV_u δV_e φ_n φ_u φ_e ε_x ε_y ε_z]^T, wherein δL, δh and δλ are the latitude, altitude and longitude errors of inertial navigation respectively; δV_n, δV_u, δV_e are the north, up and east velocity errors of inertial navigation respectively; φ_n, φ_u, φ_e are the north, up and east misalignment angles of inertial navigation respectively; and ε_x, ε_y, ε_z are the gyro drifts in the inertial navigation body coordinate system.
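The error state described above can be laid out as a 12-dimensional vector. The sketch below is illustrative only; the index layout and names are assumptions that simply mirror the order in which the patent lists the components.

```python
import numpy as np

# Hypothetical layout of the error-state vector X_k, in the order the
# text lists its components: position errors, velocity errors,
# misalignment angles, gyro drifts.
STATE_NAMES = [
    "dLat", "dH", "dLon",        # inertial position errors (lat, alt, lon)
    "dVn", "dVu", "dVe",         # velocity errors (north, up, east)
    "phi_n", "phi_u", "phi_e",   # misalignment angles (north, up, east)
    "eps_x", "eps_y", "eps_z",   # gyro drifts (body frame)
]
STATE_INDEX = {name: i for i, name in enumerate(STATE_NAMES)}

def make_state() -> np.ndarray:
    """Return a zero-initialised error-state vector X_k."""
    return np.zeros(len(STATE_NAMES))

x = make_state()
```

Keeping a name-to-index map like `STATE_INDEX` makes it easy to build the observation matrix H_k, which selects only the position-error block of this vector.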
Further, the projection P_ins of the relative distance between the inertial navigation position and the airport origin in the airport coordinate system is: P_ins = C_t^a · [(lat − lat_O)(R_m + hgt); hgt − hgt_O; (lon − lon_O)(R_n + hgt)cos(lat)], wherein C_t^a is the transformation matrix from the geographic frame to the airport frame, C_ij is the element in the i-th row and j-th column of that matrix, R_m and R_n are the radius of curvature of the meridian and the radius of curvature of the prime vertical respectively, [lat hgt lon] is the inertial navigation position, and [lat_O hgt_O lon_O] is the position of the airport origin. Further, the observed quantity Z_k is: Z_k = [P_ins[1] − X_V; P_ins[2] − H_L; P_ins[3] − Z_V], wherein X_V and Z_V are the forward and lateral positions in the airport coordinate system obtained by visual navigation, H_L is the height above ground obtained by laser ranging, and P_ins[1], P_ins[2], P_ins[3] are the forward, vertical and lateral positions in the airport coordinate system obtained by inertial navigation. The observation matrix H_k is: H_k = [H_p 0_{3×6} 0_{3×6}]. Further, the integrated navigation filtering calculation comprises the following steps: state prediction: X_{k,k-1} = Φ_{k,k-1} X_{k-1}; state prediction variance: P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Q_{k-1}; filter gain: K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}
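The projection of the inertial position into the airport frame and the construction of the observation Z_k can be sketched as follows. This is an illustrative reconstruction under stated assumptions: WGS-84 ellipsoid parameters for R_m and R_n, a small-offset linearisation around the inertial latitude, and the sign convention "inertial minus measurement"; none of these specifics are fixed by the patent text.

```python
import numpy as np

def airport_frame_position(lat, hgt, lon, lat_o, hgt_o, lon_o, C_ta):
    """Project the offset from the airport origin into the airport frame.

    lat, hgt, lon       : inertial navigation position (rad, m, rad)
    lat_o, hgt_o, lon_o : airport origin position
    C_ta                : assumed 3x3 geographic-to-airport rotation matrix
    """
    a, e2 = 6378137.0, 6.69437999014e-3           # WGS-84 (assumed) axis, ecc^2
    s = np.sin(lat)
    R_m = a * (1 - e2) / (1 - e2 * s**2) ** 1.5   # meridian radius of curvature
    R_n = a / np.sqrt(1 - e2 * s**2)              # prime-vertical radius
    d_geo = np.array([
        (lat - lat_o) * (R_m + hgt),              # north offset (m)
        hgt - hgt_o,                              # up offset (m)
        (lon - lon_o) * (R_n + hgt) * np.cos(lat) # east offset (m)
    ])
    return C_ta @ d_geo                           # P_ins in the airport frame

def observation(P_ins, X_V, Z_V, H_L):
    """Observation Z_k: inertial positions minus visual/laser measurements."""
    return np.array([P_ins[0] - X_V,   # forward (vision)
                     P_ins[1] - H_L,   # vertical (laser ranging)
                     P_ins[2] - Z_V])  # lateral (vision)

# Sanity check: aircraft at the airport origin with an identity rotation.
p = airport_frame_position(0.7, 50.0, 2.0, 0.7, 50.0, 2.0, np.eye(3))
z = observation(p, 0.0, 0.0, 0.0)
```

With the aircraft at the airport origin and perfect measurements, both the projected position and the observation reduce to zero vectors, which is the expected fixed point of the observation equation.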