CN-121994221-A - Inertial vision integrated navigation error tracking and compensating method
Abstract
The invention provides an inertial/visual integrated navigation error tracking and compensating method comprising the following steps: constructing an inertial/visual integrated navigation filtering state variable for an unmanned aerial vehicle; constructing the inertial/visual integrated navigation filtering observation equation of the unmanned aerial vehicle, taking the position information obtained by visual navigation as the observed quantity; constructing an integrated navigation error estimation evaluation model based on historical estimates and calculating an error estimation evaluation factor λ_k; performing inertial/visual integrated navigation filtering estimation based on the integrated navigation filtering state variable, the observation equation, and the error estimation evaluation factor; and compensating the integrated navigation error in real time according to the obtained filtering estimate. The invention can effectively track visual navigation results under conditions such as abrupt state changes and carrier maneuvers, greatly improves the utilization of visual navigation observations, and improves the accuracy and efficiency of integrated navigation error estimation.
Inventors
- WANG KANG
- LIU CHONGLIANG
- WEI YONGSHU
- ZHAO YUFEI
- MING LI
- LI ZHI
- ZHAO LIANG
- ZHANG WEIJIAN
- SHANG KEJUN
- HU GUANGFENG
- XU CE
Assignees
- 北京自动化控制设备研究所
Dates
- Publication Date
- 20260508
- Application Date
- 20251229
Claims (6)
- 1. An inertial/visual integrated navigation error tracking and compensating method, characterized by comprising the following steps: step 1, constructing the inertial/visual integrated navigation filtering state variable of the unmanned aerial vehicle; step 2, constructing the inertial/visual integrated navigation filtering observation equation of the unmanned aerial vehicle, taking the position information obtained by visual navigation as the observed quantity; step 3, establishing an integrated navigation error estimation evaluation model based on historical estimates and calculating the error estimation evaluation factor λ_k, the integrated navigation error estimation evaluation model being as follows: [evaluation model formulas not reproduced in the source text], where λ_k is the error estimation evaluation factor, η_k is the evaluation coefficient, N_k is the residual matrix, M_k is the prediction matrix, R_k is the observation noise matrix at time k, H_k is the observation matrix at time k, Q_{k-1} is the system noise matrix at time k-1, Φ_{k,k-1} is the integrated navigation state transition matrix, P_{k-1} is the error covariance at time k-1, and e_k is the output residual; the residual variance matrices at times k-1 and k and the one-step prediction X_{k,k-1} of the filter state quantity also enter the model; ρ is a forgetting factor, β is a weakening factor, and the values of ρ and β are determined according to the flight state and the visual navigation quality; step 4, performing inertial/visual integrated navigation filtering estimation based on the integrated navigation filtering state variable, the observation equation, and the error estimation evaluation factor; and step 5, compensating the integrated navigation error in real time according to the obtained inertial/visual integrated navigation filtering estimate.
- 2. The inertial/visual integrated navigation error tracking and compensating method according to claim 1, wherein the filtering state variable is X = [δL, δλ, δV_n, δV_e, φ_n, φ_u, φ_e, ε_x, ε_y, ε_z]^T, where δL and δλ are the latitude error and longitude error of inertial navigation respectively, δV_n and δV_e are the north and east velocity errors of inertial navigation respectively, φ_n, φ_u and φ_e are the north, up and east misalignment angles of inertial navigation respectively, and ε_x, ε_y and ε_z are the gyro drifts in the inertial navigation carrier coordinate frame.
- 3. The inertial/visual integrated navigation error tracking and compensating method according to claim 2, wherein the observed quantity Z_k is Z_k = [L_ins - L_vns, λ_ins - λ_vns]^T, where L_ins and λ_ins are the latitude and longitude calculated by inertial navigation respectively, and L_vns and λ_vns are the latitude and longitude calculated by visual navigation respectively; the observation matrix H_k is defined accordingly [matrix not reproduced in the source text]. (An illustrative sketch of this observation construction is given after the claims.)
- 4. The inertial/visual integrated navigation error tracking and compensating method according to claim 3, wherein the integrated navigation filtering calculation is as follows: state prediction: X_{k,k-1} = Φ_{k,k-1} X_{k-1}; state prediction variance: P_{k,k-1} [formula not reproduced in the source text]; filter gain: K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}; state estimation: X_k = X_{k,k-1} + K_k (Z_k - H_k X_{k,k-1}); state estimation variance: P_k = (I - K_k H_k) P_{k,k-1} (I - K_k H_k)^T + K_k R_k K_k^T; where X_{k,k-1} is the one-step prediction of the filter state quantity, X_{k-1} is the filter state quantity at time k-1, Φ_{k,k-1} is the integrated navigation state transition matrix established from the inertial navigation error propagation law, P_{k,k-1} is the one-step prediction of the error covariance, P_{k-1} is the error covariance at time k-1, Q_{k-1} is the system noise matrix at time k-1, K_k is the filter gain matrix at time k, R_k is the observation noise matrix at time k, Z_k is the observed quantity at time k, H_k is the observation matrix at time k, P_k is the error covariance at time k, and I is the identity matrix. In the invention, the subscripts k and k-1 denote the current time and the previous time, respectively.
- 5. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the inertial/visual integrated navigation error tracking and compensating method according to any one of claims 1 to 4.
- 6. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the inertial/visual integrated navigation error tracking and compensating method according to any one of claims 1 to 4.
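The following minimal Python sketch illustrates the quantities in claims 2 and 3: a ten-component error state ordered as in claim 2, and an observed quantity formed from the difference between the inertial and visual navigation positions. The symbol names and the form of H_k (position errors occupying the first two states) are assumptions made for illustration, since the patent's own matrix is not reproduced in the source text.

```python
import numpy as np

# Assumed 10-state error vector, ordered as in claim 2:
# [dL, dlam, dVn, dVe, phi_n, phi_u, phi_e, eps_x, eps_y, eps_z]
N_STATES = 10

def make_observation(lat_ins, lon_ins, lat_vns, lon_vns):
    """Build the observed quantity Z_k and an assumed observation matrix H_k.

    Z_k is the inertial-navigation position minus the visual-navigation
    position; H_k is assumed to pick out the latitude/longitude error states.
    """
    z_k = np.array([lat_ins - lat_vns,
                    lon_ins - lon_vns])
    h_k = np.zeros((2, N_STATES))
    h_k[0, 0] = 1.0  # latitude error dL
    h_k[1, 1] = 1.0  # longitude error dlam
    return z_k, h_k
```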
Description
Inertial vision integrated navigation error tracking and compensating method

Technical Field
The invention belongs to the technical field of inertial/visual integrated navigation, and particularly relates to an inertial/visual integrated navigation error tracking and compensating method.

Background
Inertial/visual integrated navigation combines the advantages of inertial navigation (complete navigation parameters, no regional limitation, and high short-term relative accuracy) with those of visual navigation (high absolute positioning accuracy and no error divergence over time), and is an effective means of achieving high-precision, strongly autonomous navigation and positioning for aircraft such as unmanned aerial vehicles. In engineering applications, inertial/visual integrated navigation is affected by many factors, such as long-duration flight of the unmanned aerial vehicle, maneuvering over complex flight segments, and varying scene topography. Under such complex and changeable conditions the filter has difficulty achieving stable estimation and accurate compensation of the various errors, so the integrated navigation performance depends strongly on the application scenario and a high-precision, high-confidence navigation positioning result cannot be provided continuously.

Disclosure of Invention
The invention aims to overcome the above defects in the prior art and provides an inertial/visual integrated navigation error tracking and compensating method that solves these problems. The technical solution of the invention is as follows. According to a first aspect, there is provided an inertial/visual integrated navigation error tracking and compensating method comprising the following steps. Step 1: construct the inertial/visual integrated navigation filtering state variable of the unmanned aerial vehicle. Step 2: construct the inertial/visual integrated navigation filtering observation equation of the unmanned aerial vehicle, taking the position information obtained by visual navigation as the observed quantity. Step 3: establish an integrated navigation error estimation evaluation model based on historical estimates and calculate the error estimation evaluation factor λ_k. The evaluation model formulas are not reproduced in the source text; in the model, λ_k is the error estimation evaluation factor, η_k is the evaluation coefficient, N_k is the residual matrix, M_k is the prediction matrix, R_k is the observation noise matrix at time k, H_k is the observation matrix at time k, Q_{k-1} is the system noise matrix at time k-1, Φ_{k,k-1} is the integrated navigation state transition matrix, P_{k-1} is the error covariance at time k-1, and e_k is the output residual; the residual variance matrices at times k-1 and k and the one-step prediction of the filter state quantity also enter the model; ρ is a forgetting factor, β is a weakening factor, and their values are determined according to the flight state and the visual navigation quality. Step 4: perform inertial/visual integrated navigation filtering estimation based on the integrated navigation filtering state variable, the observation equation, and the error estimation evaluation factor. Step 5: compensate the integrated navigation error in real time according to the obtained inertial/visual integrated navigation filtering estimate. One possible form of the step-3 evaluation model is sketched below.
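Since the source text does not reproduce the evaluation model formulas, the sketch below shows one common strong-tracking construction that uses exactly the quantities named in step 3 (output residual e_k, residual variance matrix, residual matrix N_k, prediction matrix M_k, forgetting factor ρ, weakening factor β). The specific expressions, including the trace-ratio form of η_k and the lower bound of 1 on λ_k, are assumptions for illustration rather than the patent's own formulas.

```python
import numpy as np

def evaluation_factor(e_k, v_prev, h_k, phi, p_prev, q_prev, r_k,
                      rho=0.95, beta=1.0):
    """Illustrative computation of the error estimation evaluation factor lambda_k.

    All formulas here are assumed (standard strong-tracking fading factor);
    the patent's own evaluation model is not reproduced in the source text.
    """
    # Recursive residual variance matrix, weighted by the forgetting factor rho.
    v_k = (rho * v_prev + np.outer(e_k, e_k)) / (1.0 + rho)

    # Residual matrix N_k and prediction matrix M_k (assumed forms).
    n_k = v_k - h_k @ q_prev @ h_k.T - beta * r_k
    m_k = h_k @ phi @ p_prev @ phi.T @ h_k.T

    # Evaluation coefficient eta_k and evaluation factor lambda_k (floored at 1).
    eta_k = np.trace(n_k) / np.trace(m_k)
    lambda_k = max(1.0, eta_k)
    return lambda_k, v_k
```

With a construction of this kind, λ_k grows when the measured residuals exceed what the filter prediction explains, which matches the behavior the patent attributes to abrupt state changes and carrier maneuvers.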
Further, the filtering state variable is X = [δL, δλ, δV_n, δV_e, φ_n, φ_u, φ_e, ε_x, ε_y, ε_z]^T, where δL and δλ are the latitude error and longitude error of inertial navigation respectively, δV_n and δV_e are the north and east velocity errors of inertial navigation respectively, φ_n, φ_u and φ_e are the north, up and east misalignment angles of inertial navigation respectively, and ε_x, ε_y and ε_z are the gyro drifts in the inertial navigation carrier coordinate frame. Further, the observed quantity Z_k is Z_k = [L_ins - L_vns, λ_ins - λ_vns]^T, where L_ins and λ_ins are the latitude and longitude calculated by inertial navigation respectively, and L_vns and λ_vns are the latitude and longitude calculated by visual navigation respectively; the observation matrix H_k is defined accordingly [matrix not reproduced in the source text]. Further, the integrated navigation filtering calculation is as follows: state prediction: X_{k,k-1} = Φ_{k,k-1} X_{k-1}; state prediction variance: P_{k,k-1} [formula not reproduced in the source text]; filter gain: K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}; state estimation: X_k = X_{k,k-1} + K_k (Z_k - H_k X_{k,k-1}); state estimation variance: P_k = (I - K_k H_k) P_{k,k-1} (I - K_k H_k)^T + K_k R_k K_k^T; where X_{k,k-1} is the one-step prediction of the filter state quantity, X_{k-1} is the filter state quantity at time k-1, Φ_{k,k-1} is the integrated navigation state transition matrix established from the inertial navigation error propagation law, P_{k,k-1} is the one-step prediction of the error covariance, P_{k-1} is the error covariance at time k-1, Q_{k-1} is the system noise matrix at time k-1, K_k is the filter gain matrix at time k, R_k is the observation noise matrix at time k, Z_k is the observed quantity at time k, H_k is the observation matrix at time k, P_k is the error covariance at time k, and I is the identity matrix. The subscripts k and k-1 denote the current time and the previous time, respectively. A hedged sketch of one filter cycle is given below.
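As a companion to the equations of claim 4, here is a minimal Python sketch of one filter cycle. The prediction, gain, update, and Joseph-form covariance equations follow the claim directly; the placement of λ_k inflating the predicted covariance and the final compensation convention are assumptions, since the state-prediction-variance formula and the compensation details are not reproduced in the source text.

```python
import numpy as np

def filter_step(x_prev, p_prev, phi, q_prev, h_k, r_k, z_k, lambda_k=1.0):
    """One inertial/visual integrated navigation filter cycle (claim 4 sketch).

    lambda_k inflating the predicted covariance is an assumed placement of the
    error estimation evaluation factor, not taken from the source text.
    """
    # State prediction: X_{k,k-1} = Phi_{k,k-1} X_{k-1}
    x_pred = phi @ x_prev
    # State prediction variance (assumed form with lambda_k):
    # P_{k,k-1} = lambda_k * Phi P_{k-1} Phi^T + Q_{k-1}
    p_pred = lambda_k * (phi @ p_prev @ phi.T) + q_prev
    # Filter gain: K_k = P_{k,k-1} H^T (H P_{k,k-1} H^T + R)^-1
    k_gain = p_pred @ h_k.T @ np.linalg.inv(h_k @ p_pred @ h_k.T + r_k)
    # State estimation: X_k = X_{k,k-1} + K_k (Z_k - H X_{k,k-1})
    x_est = x_pred + k_gain @ (z_k - h_k @ x_pred)
    # State estimation variance (Joseph form, as in the claim).
    i_mat = np.eye(p_prev.shape[0])
    p_est = ((i_mat - k_gain @ h_k) @ p_pred @ (i_mat - k_gain @ h_k).T
             + k_gain @ r_k @ k_gain.T)
    return x_est, p_est

# Step 5 (real-time compensation) would then subtract the estimated errors
# from the inertial solution, e.g. corrected latitude = INS latitude - x_est[0];
# this correction convention is likewise an assumption for illustration.
```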