US-12618679-B2 - Terrain referenced navigation system

US 12618679 B2

Abstract

A navigation system for an aircraft includes a microelectromechanical systems inertial measurement unit (MEMS-IMU), a camera that captures, over time, a series of images of the terrain below the aircraft, a terrain map comprising terrain elevation data, an inertial navigation system (INS) and an optical terrain-referenced navigation unit (O-TRN). The INS generates a first position estimate based on one or more signals output by the MEMS-IMU. The O-TRN identifies a terrain feature in an image captured by the camera, derives a terrain height estimate of the terrain feature, and generates a second position estimate based on a comparison between the terrain height estimate and terrain elevation data extracted from the terrain map. The navigation system generates a third position estimate based on the first position estimate and the second position estimate.

Inventors

  • Geoffrey Thomas HENDERSON

Assignees

  • ATLANTIC INERTIAL SYSTEMS LIMITED

Dates

Publication Date
2026-05-05
Application Date
2023-12-13
Priority Date
2022-12-16

Claims (14)

  1. A navigation system for an aircraft, comprising: a microelectromechanical systems inertial measurement unit (MEMS-IMU); a camera arranged to capture, over time, a series of images of the terrain below the aircraft; a terrain map comprising terrain elevation data; an inertial navigation system (INS) arranged to generate a first position estimate based on one or more signals output by the MEMS-IMU; and an optical terrain-referenced navigation unit (O-TRN) configured to: identify a terrain feature in two or more images captured by the camera; calculate a flow rate at which the terrain feature passes below the aircraft based on said two or more images; derive a terrain height estimate of the terrain feature based on the calculated flow rate; and generate a second position estimate based on a comparison between the terrain height estimate and terrain elevation data extracted from the terrain map, wherein the navigation system is configured to generate a third position estimate based on the first position estimate and the second position estimate.
  2. The navigation system as claimed in claim 1, wherein the navigation system further includes: an iterative algorithm unit (IAU) arranged to determine a system state array in each iteration, the system state array comprising a plurality of state variables, wherein the IAU is configured, in each iteration, to: update the system state array for the next iteration based on the current system state array, the first position estimate and the second position estimate; and generate the third position estimate based on the updated system state array.
  3. The navigation system as claimed in claim 2, wherein: each state variable comprises an estimate of an error associated with a corresponding system characteristic; and the IAU is arranged, in each iteration, to generate the third position estimate by applying an error correction to the first position estimate based on the updated system state array.
  4. The navigation system as claimed in claim 2, wherein: the MEMS-IMU comprises one or more accelerometers and/or gyroscopes; and the system state array comprises at least one state variable corresponding to an alignment error, scale factor error, and/or a bias error of an accelerometer and/or gyroscope.
  5. The navigation system as claimed in claim 2, wherein: the INS comprises an integrator; and the system state array comprises at least one state variable associated with an operational parameter of the integrator.
  6. The navigation system as claimed in claim 5, wherein the navigation system is arranged to control the integrator based on the current state variable associated with the operational parameter of the integrator in the system state array.
  7. The navigation system as claimed in claim 2, wherein the system state array comprises state variables corresponding to one or more of: up to three angle or tilt errors; up to three velocity errors; up to three position errors; up to three gyroscope bias errors; up to three accelerometer bias errors; up to four, or more, operational parameters or coefficients of integrators; up to three accelerometer scale factor errors; up to three gyroscope scale factor errors; up to six accelerometer alignment errors; up to six gyroscope alignment errors; and up to nine gyroscope acceleration- or g-sensitivity errors.
  8. The navigation system as claimed in claim 2, wherein the IAU comprises a Kalman filter.
  9. The navigation system as claimed in claim 1, wherein: the INS is arranged to output a first velocity estimate and a first altitude estimate based on one or more signals output by the MEMS-IMU; and the O-TRN is arranged to derive the terrain height estimate further based on the first velocity estimate and the first altitude estimate.
  10. The navigation system as claimed in claim 1, further comprising: a barometric altimeter; wherein the O-TRN is arranged to derive the terrain height estimate further based on a second altitude estimate derived from an output of the barometric altimeter.
  11. The navigation system as claimed in claim 1, wherein the O-TRN is configured to: identify a plurality of terrain features in a plurality of images captured by the camera; derive a respective terrain height estimate of each of the plurality of terrain features; and output the second position estimate based on a correlation between the plurality of terrain height estimates and terrain elevation data extracted from the terrain map.
  12. The navigation system as claimed in claim 2, wherein the system state array comprises greater than fifteen state variables, optionally at least twenty state variables, optionally at least thirty state variables, optionally at least forty state variables, optionally at least forty-five state variables.
  13. The navigation system as claimed in claim 1, wherein the camera comprises a short-wave infra-red camera.
  14. A method of navigating an aircraft, the method comprising: deriving a first position estimate based on one or more signals output by a microelectromechanical systems inertial measurement unit (MEMS-IMU); capturing, over time, a series of images of the terrain below the aircraft using a camera; identifying a terrain feature in two or more images captured by the camera; calculating a flow rate at which the terrain feature passes below the aircraft based on said two or more images; deriving a terrain height estimate of the terrain feature based on the calculated flow rate; deriving a second position estimate based on a comparison between the terrain height estimate and terrain elevation data extracted from a terrain map; and deriving a third position estimate based on the first position estimate and the second position estimate.
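The height derivation recited in claims 1 and 14 can be illustrated with a simple geometric sketch. Assuming a nadir-pointing camera and small angles, a feature directly below the aircraft sweeps through the field of view at an angular rate of roughly v / h, where v is the ground speed and h the height above the feature; the patent does not disclose a specific implementation, and all names below are illustrative.

```python
def terrain_height_estimate(ground_speed_mps: float,
                            flow_rate_rad_s: float,
                            aircraft_altitude_m: float) -> float:
    """Estimate the elevation of a tracked terrain feature.

    Inverting the small-angle flow relation (flow ~= v / h) gives the
    aircraft's height above the feature; subtracting that from the
    aircraft's altitude yields the feature's elevation above the datum.
    """
    height_above_feature = ground_speed_mps / flow_rate_rad_s
    return aircraft_altitude_m - height_above_feature

# At 50 m/s ground speed, a flow of 0.05 rad/s implies 1000 m above the
# feature; from 1500 m altitude the feature elevation is 500 m.
print(terrain_height_estimate(50.0, 0.05, 1500.0))  # 500.0
```

In practice the flow rate would be measured by tracking the feature across two or more images (claims 1 and 14), and the velocity and altitude inputs would come from the INS or a barometric altimeter (claims 9 and 10).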

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority to European Patent Application No. 22275163.8 filed Dec. 16, 2022, the entire contents of which is incorporated herein by reference.

TECHNICAL FIELD

The present disclosure relates to the field of navigation systems, particularly navigation systems for use on aircraft.

BACKGROUND

Inertial Navigation Systems (INS), which are in service on a number of land, sea and air vehicles, use inertial measurement sensors (e.g. gyroscopes, accelerometers, etc.) to provide a navigation solution based on the principles of dead reckoning. However, position estimates obtained from INSs typically drift over time due to the accumulation of errors in the outputs of accelerometers and gyroscopes. To mitigate this to some extent, high-quality (e.g. navigation-grade) INSs typically require high-precision, low-drift inertial sensors, which are generally physically large and expensive to manufacture.

To further counteract (or correct for) the drift of INS position estimates, they are often used in conjunction with other sources of position estimates such as Global Navigation Satellite Systems (GNSS), e.g. GPS, Galileo, GLONASS, etc. However, satellite-based navigation systems are not always reliable: signals can be jammed, spoofed, blocked, etc., which can cause problems for navigation systems that rely too heavily on them.

In view of this, some navigation systems use a terrain-referenced navigation (TRN) system in conjunction with an INS to counteract (or correct for) INS drift, particularly though not exclusively on aircraft. Such systems typically use radar altimeters to estimate terrain elevation beneath an aircraft. Measured terrain elevation estimates are then correlated with stored terrain elevation data (e.g. a Digital Terrain Elevation Database, DTED) along the aircraft's travel path in order to produce a navigation solution.
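The terrain-correlation step described in the background can be sketched in one dimension: a run of measured terrain elevations along the flight path is slid across a stored elevation profile (e.g. a DTED row), and the offset with the smallest mismatch gives the position fix. This is an illustrative simplification, not the patent's method; names and the 1-D formulation are assumptions.

```python
import numpy as np

def best_profile_offset(measured: np.ndarray, stored_profile: np.ndarray) -> int:
    """Return the offset into stored_profile that best matches `measured`.

    Uses a sum-of-squared-differences match at every candidate offset,
    which is one common way to correlate a measured elevation run
    against a stored terrain database.
    """
    n = len(measured)
    errors = [np.sum((stored_profile[i:i + n] - measured) ** 2)
              for i in range(len(stored_profile) - n + 1)]
    return int(np.argmin(errors))

# Stored elevation profile along the expected track (metres).
stored = np.array([10.0, 12.0, 30.0, 55.0, 42.0, 20.0, 15.0, 11.0])
# Noisy elevation measurements taken in flight.
measured = np.array([30.5, 54.2, 42.3])
print(best_profile_offset(measured, stored))  # 2
```

Real TRN systems perform this correlation in two dimensions over the aircraft's travel path and fold the result into the navigation filter rather than taking a hard argmin.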
SUMMARY

When viewed from a first aspect this disclosure provides a navigation system for an aircraft. The system includes: a microelectromechanical systems inertial measurement unit (MEMS-IMU); a camera arranged to capture, over time, a series of images of the terrain below the aircraft; a terrain map comprising terrain elevation data; an inertial navigation system (INS) arranged to generate a first position estimate based on one or more signals output by the MEMS-IMU; and an optical terrain-referenced navigation unit (O-TRN). The O-TRN is configured to: identify a terrain feature in an image captured by the camera; derive a terrain height estimate of the terrain feature; and generate a second position estimate based on a comparison between the terrain height estimate and terrain elevation data extracted from the terrain map, wherein the navigation system is configured to generate a third position estimate based on the first position estimate and the second position estimate.

Inertial MEMS sensors are typically physically smaller than, and significantly cheaper to manufacture than, more accurate inertial sensors such as ring laser gyroscopes, fibre optic gyroscopes, etc. However, inertial MEMS sensors can suffer from significant errors (e.g. bias errors, scale factor errors, etc.), which can cause position estimates derived from their outputs to drift substantially over time, potentially reducing their usability in navigation systems where accuracy is a key concern. Various techniques have been used to help correct or keep track of the errors in INSs, e.g. using other sources of positional information such as GNSS positioning systems. However, GNSS information can become unavailable or unreliable in certain circumstances, and so it is desirable to be able to correct the INS by other means when required.
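The summary's fusion of the first (INS) and second (O-TRN) position estimates into a third estimate would, per claim 8, typically be done with a Kalman filter over an error-state array. The minimal 1-D variance-weighted blend below illustrates the principle only; it is a deliberate simplification, and all names are hypothetical.

```python
def fuse(ins_pos: float, ins_var: float,
         otrn_pos: float, otrn_var: float) -> tuple[float, float]:
    """Blend two position estimates (1-D) weighted by their variances.

    The gain k plays the role of a scalar Kalman gain: the less certain
    the INS estimate is relative to the O-TRN fix, the more the fused
    result is pulled toward the O-TRN position. The fused variance is
    smaller than either input variance.
    """
    k = ins_var / (ins_var + otrn_var)      # scalar Kalman-style gain
    fused_pos = ins_pos + k * (otrn_pos - ins_pos)
    fused_var = (1.0 - k) * ins_var
    return fused_pos, fused_var

# Drifting INS says 1000 m (variance 100); O-TRN fix says 980 m (variance 25).
pos, var = fuse(ins_pos=1000.0, ins_var=100.0, otrn_pos=980.0, otrn_var=25.0)
print(pos, var)  # roughly 984.0 and 20.0
```

A full implementation would instead propagate the claim 2 system state array (tilt, velocity, position, bias, scale factor and alignment errors) and apply the resulting corrections to the INS output, as in claims 3 to 7.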
In order to compensate for these errors, the present disclosure combines a position estimate from a MEMS-based INS with a second position estimate from an O-TRN, thereby enabling the system to provide a third, more accurate position estimate for an aircraft. Thus, the present disclosure may provide a navigation system that is physically small and inexpensive to manufacture while still providing a navigation solution of acceptable accuracy. This may make the navigation system particularly suitable for applications where space is limited or added weight is undesirable, e.g. light aircraft such as drones and helicopters, though it will be appreciated that the navigation system is equally suitable for other types of aircraft, e.g. larger passenger and/or freight aircraft. Furthermore, the present disclosure provides a navigation system which may make use of terrain-referenced navigation (TRN) technology without requiring a radar altimeter (RADALT) to monitor terrain beneath the aircraft, as in typical TRN solutions. RADALTs, by design, require electromagnetic radiation to be emitted from the aircraft and reflections from tar