CN-121977541-A - Intelligent multi-source navigation method based on inertial and Beidou/visual information fusion

CN 121977541 A

Abstract

An intelligent multi-source navigation method based on inertial and Beidou/visual information fusion. The method first establishes a multi-source information system centered on the inertial navigation system (INS) and embeds an IMU multi-parameter online calibration mechanism, based on carrier dynamics, in the inertial navigation solution. A tightly coupled fusion framework is then built on the INS solution results: the high-frequency INS pose information is used to construct tight-combination measurements with the raw BDS observables on one hand, and reprojection-error measurements with visual feature points on the other. An intelligent adaptive filter constrained by the inertial dynamics error model is adopted, with the INS error equation as the state prediction model, and the Sage-Husa algorithm is introduced to estimate the system noise statistics in real time.

Inventors

  • Gao Zhiqiang
  • Meng Fanchen
  • Feng Huishuo
  • Zhou Ruiyang
  • Zhao Xuewei
  • Wang Deyan
  • Nan Zihan

Assignees

  • Beijing Institute of Aerospace Control Devices (北京航天控制仪器研究所)

Dates

Publication Date
2026-05-05
Application Date
2025-12-18

Claims (9)

  1. An intelligent multi-source navigation method based on inertial and Beidou/visual information fusion, characterized by comprising the following steps: acquiring, in real time, the angle increments and velocity increments of the inertial measurement unit (IMU) in the inertial navigation system, the raw observables of the Beidou satellite navigation system (BDS), and the visual time-series images; establishing the inertial navigation differential equations, embedding an IMU multi-parameter calibration and compensation mechanism in the inertial recursion, and performing real-time estimation and feedback compensation of the gyroscope and accelerometer zero-bias and scale-factor errors to obtain the carrier attitude, velocity, and position at the current moment; according to the carrier attitude, velocity, and position, using the high-frequency pose information obtained by the inertial recursion to construct tight-combination measurements with the raw BDS observables and reprojection-error measurements with the visual feature points; constructing an adaptive filter constrained by the inertial dynamics error model, taking the inertial dynamics error model as the state prediction model and the tight-combination measurements as observation updates, to estimate and adjust the system noise of the inertial navigation system in real time; and estimating the IMU errors and navigation error parameters with the adaptive filter, feeding them back to the inertial navigation solution through closed-loop correction, and performing real-time correction and error suppression on the navigation solution to obtain the corrected navigation result.
  2. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 1, characterized in that, when Kalman filtering is used to realize real-time estimation and feedback compensation of the gyroscope and accelerometer zero-bias and scale-factor errors, the IMU error parameters are augmented into the inertial navigation error state vector to enable observable real-time estimation; the augmented state vector is x = [φᵀ, δvᵀ, δPᵀ, ε_bᵀ, ∇_bᵀ, δs_gᵀ, δs_aᵀ]ᵀ, where φ is the misalignment angle, δv the velocity error, δP the position error, ε_b the gyro zero-bias, ∇_b the accelerometer zero-bias, and δs_g and δs_a the gyro and accelerometer scale-factor errors, respectively, together forming a 21-dimensional error state vector.
  3. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 1, characterized in that, in the tight-combination measurement process, when the BDS signal is lost and the system enters a pure-inertial or vision-aided mode, the metric scale information carried by the INS velocity output is used to recover the scale of the scale-free relative displacement computed by monocular vision; the scale factor s required for scale recovery is s = ‖v_ins · Δt‖ / ‖Δp‖, where v_ins is the INS-solved velocity, Δt the time interval, and Δp the scale-free displacement vector computed from the vision camera. The scale factor is used to normalize the visually estimated displacement and map, maintaining the metric scale accuracy of the navigation solution during satellite signal outages.
  4. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 1, characterized in that, in the process of acquiring high-frequency pose information by inertial recursion, the high-frequency pose information yields a predicted pixel coordinate: the attitude matrix C_k and position P_k at the previous moment and the attitude matrix C_{k+1} and position P_{k+1} at the current moment are obtained by inertial recursion; the three-dimensional coordinates of a feature point in the camera coordinate system at time k, acquired by the visual front end of the navigation system, are converted into the camera coordinate system at time k+1 to obtain the predicted pixel coordinate û_{k+1} as û_{k+1} = (1/ẑ_{k+1}) · K · C_cb⁻¹ · C_{k+1}⁻¹ · (C_k · C_cb · ẑ_k · K⁻¹ · ũ_k + P_k − P_{k+1}), where K is the camera intrinsic matrix, C_cb the mounting matrix from the camera to the IMU, C_cb⁻¹ its inverse, and ẑ_k, ẑ_{k+1} the depth estimates.
  5. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 4, characterized in that the visual reprojection error is acquired as follows: based on the predicted pixel coordinate û_{k+1} and the pixel coordinate z_{k+1} actually observed at time k+1, the reprojection error is computed as r_{k+1} = z_{k+1} − û_{k+1}, where r_{k+1} is the visual reprojection error.
  6. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 4, characterized in that, after the tight-combination measurements are taken as observation updates, a Sage-Husa adaptive algorithm is introduced to estimate the system noise in real time: the system process noise covariance matrix Q and the measurement noise covariance matrix R are estimated and adjusted in real time by the Sage-Husa adaptive algorithm and output together as the system noise statistics, respectively as R_k = (1 − d_k) R_{k−1} + d_k (ε_k ε_kᵀ − H_k P_{k|k−1} H_kᵀ) and Q_k = (1 − d_k) Q_{k−1} + d_k (K_k ε_k ε_kᵀ K_kᵀ + P_k − Φ_{k|k−1} P_{k−1} Φ_{k|k−1}ᵀ), where ε_k is the filtering innovation at time k, H_k the measurement matrix, P the state estimation error covariance matrix, K_k the filter gain matrix, Φ_{k|k−1} the state transition matrix, and d_k the forgetting factor.
  7. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 6, characterized in that the adaptive filter is provided with a fault detection and isolation (FDI) mechanism: using the short-term prediction of the inertial navigation system as reference, chi-square test statistics are computed for the BDS and visual measurement information; when the chi-square statistic of any information source continuously exceeds a preset threshold, that source is judged faulty, and the measurement noise covariance matrix R in the adaptive filter is adaptively adjusted.
  8. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 6, characterized in that the navigation error parameters are used for full-state feedback correction: the navigation error parameters estimated by the adaptive filter comprise the misalignment angle φ, velocity error δv, and position error δP, all of which are used to correct the corresponding state quantities in the inertial navigation solution, and the estimated IMU errors are fed back to the preprocessing stage of the raw IMU data for real-time compensation.
  9. The intelligent multi-source navigation method based on inertial and Beidou/visual information fusion of claim 7, characterized in that the adaptive filter adopts a closed-loop correction architecture, and error suppression through the closed-loop correction architecture is performed as follows: the estimated IMU error parameters ε_b, ∇_b, δs_g, δs_a are fed back in real time to the front end of the INS mechanization for compensation, and the navigation error parameters φ, δv, δP are input into the INS solution in a feedback-correction manner, forming a fully closed-loop error suppression mechanism.
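The scale-recovery ratio of claim 3 can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name and the assumption that v_ins is constant over Δt are ours.

```python
import numpy as np

def scale_factor(v_ins: np.ndarray, dt: float, dp_vision: np.ndarray) -> float:
    """Recover the metric scale for monocular vision (claim 3 sketch).

    v_ins     : INS-solved velocity (m/s), assumed roughly constant over dt
    dt        : time interval between the two camera frames (s)
    dp_vision : scale-free relative displacement from monocular visual odometry
    Returns s such that s * dp_vision approximates the metric displacement.
    """
    # Metric distance travelled according to the INS over dt, divided by
    # the norm of the up-to-scale visual displacement.
    return float(np.linalg.norm(v_ins * dt) / np.linalg.norm(dp_vision))
```

Multiplying the visual displacement (and map points) by s restores metric units while the BDS signal is unavailable.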
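The reprojection prediction and error of claims 4 and 5 can be sketched with homogeneous 4x4 pose matrices. This is an illustrative reconstruction under our own naming (T_c2b for the camera-to-IMU mounting transform, T_wb for body-to-world IMU poses); the patent's exact formulation may differ.

```python
import numpy as np

def predict_pixel(K, T_c2b, T_wb_k, T_wb_k1, u_k, depth_k):
    """Predict a feature's pixel at time k+1 from IMU-derived poses (claims 4-5 sketch).

    K        : 3x3 camera intrinsic matrix
    T_c2b    : 4x4 camera-to-IMU mounting transform
    T_wb_k   : 4x4 IMU pose (body -> world) at time k, from inertial recursion
    T_wb_k1  : 4x4 IMU pose at time k+1
    u_k      : pixel (u, v) observed at time k
    depth_k  : estimated depth of the feature in the camera frame at time k
    """
    # Back-project the pixel into the camera frame at time k.
    p_c_k = depth_k * np.linalg.inv(K) @ np.array([u_k[0], u_k[1], 1.0])
    p_h = np.append(p_c_k, 1.0)  # homogeneous coordinates
    # Chain: camera_k -> body_k -> world -> body_{k+1} -> camera_{k+1}.
    T = np.linalg.inv(T_c2b) @ np.linalg.inv(T_wb_k1) @ T_wb_k @ T_c2b
    p_c_k1 = (T @ p_h)[:3]
    # Re-project into the image at time k+1 (perspective division).
    uvw = K @ p_c_k1
    return uvw[:2] / uvw[2]

def reprojection_error(u_pred, u_obs):
    """Visual re-projection measurement: observed minus predicted pixel."""
    return np.asarray(u_obs, float) - np.asarray(u_pred, float)
```

With identical poses at k and k+1 the predicted pixel reduces to the observed one, which is a convenient sanity check for the transform chain.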
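The Sage-Husa noise-statistics update of claim 6 can be sketched as a single function. This follows the standard Sage-Husa recursion with the symbols named in the claim; the argument layout is our own convention, not the patent's.

```python
import numpy as np

def sage_husa_update(eps, H, P_pred, K_gain, Phi, P_prev, P_post, Q_prev, R_prev, d_k):
    """One Sage-Husa step estimating process/measurement noise (claim 6 sketch).

    eps             : filtering innovation at time k
    H               : measurement matrix H_k
    P_pred          : predicted state covariance P_{k|k-1}
    K_gain          : Kalman gain K_k
    Phi             : state transition matrix Phi_{k|k-1}
    P_prev, P_post  : posterior state covariances at k-1 and k
    Q_prev, R_prev  : previous noise covariance estimates
    d_k             : forgetting-factor weight in (0, 1)
    Returns the updated (Q, R) estimates.
    """
    ee = np.outer(eps, eps)
    # R: innovation covariance minus the part explained by state uncertainty.
    R = (1.0 - d_k) * R_prev + d_k * (ee - H @ P_pred @ H.T)
    # Q: gain-weighted innovation plus the covariance bookkeeping terms.
    Q = (1.0 - d_k) * Q_prev + d_k * (K_gain @ ee @ K_gain.T + P_post - Phi @ P_prev @ Phi.T)
    return Q, R
```

In practice the updated R and Q would be fed into the next predict/update cycle of the adaptive filter, so the noise model tracks BDS and vision quality changes.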
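The chi-square fault detection of claim 7 can be sketched as a test statistic plus a persistence counter. The statistic is the standard normalized innovation squared; the threshold, the persistence count, and the class shape are illustrative assumptions, since the claim only states that a source is flagged when its statistic "continuously exceeds a preset threshold".

```python
import numpy as np

def chi_square_stat(eps, H, P_pred, R):
    """Chi-square fault-detection statistic for one measurement source (claim 7 sketch)."""
    S = H @ P_pred @ H.T + R           # innovation covariance
    return float(eps @ np.linalg.solve(S, eps))

class FaultIsolator:
    """Flag a source as faulty when its statistic exceeds the threshold
    for `persist` consecutive epochs (both parameters are illustrative)."""

    def __init__(self, threshold: float, persist: int = 3):
        self.threshold, self.persist, self.count = threshold, persist, 0

    def update(self, stat: float) -> bool:
        # Consecutive exceedances accumulate; any pass resets the counter.
        self.count = self.count + 1 if stat > self.threshold else 0
        return self.count >= self.persist
```

When a source is flagged, the filter would inflate the corresponding block of R (or drop the measurement), which is the adaptive adjustment the claim describes.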

Description

Intelligent multi-source navigation method based on inertial and Beidou/visual information fusion

Technical Field

The invention relates to an intelligent multi-source navigation method based on inertial and Beidou/visual information fusion, and belongs to the technical field of multi-source navigation.

Background

Navigation and positioning technology is a key support in fields such as aviation, aerospace, marine navigation, and unmanned systems. A single navigation system is limited by its own principle and can hardly maintain continuous high accuracy and high reliability in complex environments, so multi-source information fusion has become a necessary choice. The inertial navigation system (INS) is the core of integrated navigation by virtue of its autonomy, high-frequency output, and strong anti-interference capability. However, its integration-based solution accumulates sensor errors over time and cannot work independently for long periods; combining the Beidou satellite navigation system (BDS) with the INS effectively suppresses error divergence. The traditional loose-combination algorithm is simple but poorly robust, while the tight combination improves performance by fusing raw observables yet fails in long-term satellite-signal-denied environments. Visual navigation computes relative pose from image features and, combined with the INS, can compensate inertial drift; however, it suffers from scale uncertainty, is easily affected by environmental texture and illumination changes, and has limited tracking stability and real-time performance.
The existing multi-source fusion navigation has the following problems: (1) a loose-coupling or federated filtering structure is adopted, and a deep fusion framework with the INS mechanization as the core and the raw BDS observables and visual features as tightly coupled measurement sources cannot be established; (2) the fusion algorithm adopts a fixed noise statistical model and lacks an adaptation mechanism for variable factors such as BDS observation quality degradation and changes in visual feature tracking stability, so estimation accuracy in dynamic environments degrades and the filter may even diverge; (3) an intelligent fault detection and isolation mechanism referenced to the INS short-term prediction is lacking, so autonomous optimization of fusion weights and seamless switching of navigation modes cannot be realized when BDS/visual sensor performance degrades or fails; and (4) effective online calibration and compensation means are lacking for deep parameters, such as IMU scale-factor errors, that affect the accuracy of the INS core solution.

Disclosure of Invention

Aiming at the problem that the prior art lacks effective calibration and compensation means, the invention provides an intelligent multi-source navigation method based on inertial and Beidou/visual information fusion.
The invention solves the technical problems by the following technical proposal: an intelligent multi-source navigation method based on inertial and Beidou/visual information fusion, comprising the following steps: acquiring, in real time, the angle increments and velocity increments of the inertial measurement unit (IMU) in the inertial navigation system, the raw observables of the Beidou satellite navigation system (BDS), and the visual time-series images; establishing the inertial navigation differential equations, embedding an IMU multi-parameter calibration and compensation mechanism in the inertial recursion, and performing real-time estimation and feedback compensation of the gyroscope and accelerometer zero-bias and scale-factor errors to obtain the carrier attitude, velocity, and position at the current moment; according to the carrier attitude, velocity, and position, using the high-frequency pose information obtained by the inertial recursion to construct tight-combination measurements with the raw BDS observables and reprojection-error measurements with the visual feature points; constructing an adaptive filter constrained by the inertial dynamics error model, taking the inertial dynamics error model as the state prediction model and the tight-combination measurements as observation updates, to estimate and adjust the system noise of the inertial navigation system in real time; and estimating the IMU errors and navigation error parameters with the adaptive filter, feeding them back to the inertial navigation solution through closed-loop correction, and performing real-time correction and error suppression on the navigation solution to obtain the corrected navigation result.