US-12625256-B2 - Determining localization error
Abstract
The disclosed technology provides solutions for improving the accuracy of localization error estimates and in particular, provides methods for improving the accuracy of error estimates associated with individual localizers. A method of the disclosed technology can include steps for receiving a first location error estimate, corresponding with a first localizer of a first autonomous vehicle (AV), receiving a second location error estimate, corresponding with a second localizer of the first AV, and associating the first location error estimate and the second location error estimate with first location metadata and first environmental metadata corresponding with the first AV. The method can further include steps for determining a location error variance for the first localizer, based on the first location metadata, and the first environmental metadata. Systems and machine-readable media are also provided.
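The aggregation described above — pooling per-localizer error estimates tagged with location and environmental metadata, then deriving a per-localizer error variance — can be sketched roughly as follows. This is an illustrative assumption, not the patented implementation: the function name, the bucket keys, and the numeric values are all hypothetical.

```python
from collections import defaultdict
from statistics import pvariance

def aggregate_error_variance(reports):
    """reports: iterable of (localizer_id, location_class, env_class, error_estimate).

    Buckets error estimates reported by many AVs according to their location
    and environmental metadata, then returns the variance of the pooled
    estimates for each (localizer, location class, environment class) bucket.
    """
    buckets = defaultdict(list)
    for localizer_id, loc_class, env_class, err in reports:
        buckets[(localizer_id, loc_class, env_class)].append(err)
    # Only buckets with more than one sample yield a meaningful variance.
    return {key: pvariance(errs) for key, errs in buckets.items() if len(errs) > 1}

# Illustrative reports from two AVs operating under similar conditions:
reports = [
    ("lidar", "urban_canyon", "dense_foliage", 0.32),
    ("lidar", "urban_canyon", "dense_foliage", 0.41),
    ("gnss", "urban_canyon", "dense_foliage", 1.9),
    ("gnss", "urban_canyon", "dense_foliage", 2.4),
]
variances = aggregate_error_variance(reports)
```

In this sketch, a fleet backend could transmit the per-bucket variance back to an AV whose current metadata matches a bucket's location and environment classes.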
Inventors
- SHAHRAM REZAEI
Assignees
- GM CRUISE HOLDINGS LLC
Dates
- Publication Date
: 2026-05-12
- Application Date
: 2022-08-29
Claims (20)
- 1 . A system for determining a localization error, comprising: at least one memory; and at least one processor coupled to the at least one memory, the at least one processor configured to: receive a first location error estimate, corresponding with a first localizer of a first autonomous vehicle (AV), the first localizer of the first AV comprising a Light Detection and Ranging (LiDAR) localizer; receive a second location error estimate, corresponding with a second localizer of the first AV, the second localizer of the first AV comprising a Global Navigation Satellite System (GNSS) localizer; associate the first location error estimate and the second location error estimate with first location metadata and first environmental metadata corresponding with the first AV, the first environmental metadata comprising a first measure of scene complexity based on a first amount of motion of moving entities detected by the first AV and an indication of a presence of foliage having diffuse reflection properties that degrade accuracy of the first localizer of the first AV; receive a third location error estimate, corresponding with a first localizer of a second AV, the first localizer of the second AV comprising a LiDAR localizer; receive a fourth location error estimate, corresponding with a second localizer of the second AV, the second localizer of the second AV comprising a GNSS localizer; associate the third location error estimate and the fourth location error estimate with second location metadata and second environmental metadata associated with the second AV, the second environmental metadata comprising a second measure of scene complexity based on a second amount of motion of moving entities detected by the second AV and an indication of a presence of foliage having diffuse reflection properties that degrade accuracy of the first localizer of the second AV; determine a location error variance for each of the first localizer and the second localizer of the first AV, 
based on the first location metadata, the first environmental metadata, the second location metadata, and the second environmental metadata; and transmit the location error variance to the first AV, the location error variance, when received by the first AV, is configured to cause the first AV to update an uncertainty parameter of the first localizer or the second localizer of the first AV during a subsequent localization instance when the first AV determines that a similarity score between third environmental metadata corresponding with the first AV and the first environmental metadata exceeds a predetermined threshold, wherein the updated uncertainty parameter is used by a sensor fusion process of the first AV to determine a relative weighting of inputs from the first localizer or the second localizer when generating an aggregated location estimate for the first AV.
- 2 . The system of claim 1 , wherein the first location metadata indicates a location of the first AV at a time of measurement for the first location error estimate and the second location error estimate.
- 3 . The system of claim 2 , wherein the second location metadata indicates a location of the second AV at a time of measurement for the third location error estimate and the fourth location error estimate.
- 4 . The system of claim 3 , wherein the location of the first AV and the location of the second AV are part of a common location classification based on the first environmental metadata and the second environmental metadata.
- 5 . The system of claim 3 , wherein the location of the first AV and the location of the second AV are different.
- 6 . The system of claim 1 , wherein the first environmental metadata further comprises information indicating environmental conditions around the first AV, and the second environmental metadata further comprises information indicating environmental conditions around the second AV.
- 7 . The system of claim 1 , wherein the first localizer of the first AV comprises a combination of the LiDAR localizer, a GNSS localizer, and an Inertial Measurement Unit (IMU) localizer.
- 8 . The system of claim 1 , wherein the second localizer of the first AV comprises a combination of the GNSS localizer, an Inertial Measurement Unit (IMU) localizer, and a LiDAR localizer.
- 9 . The system of claim 1 , wherein the first localizer of the first AV is different from the second localizer of the first AV.
- 10 . The system of claim 1 , wherein the first localizer of the first AV is the same as the first localizer of the second AV.
- 11 . A computer-implemented method for determining a localization error, comprising: receiving a first location error estimate, corresponding with a first localizer of a first autonomous vehicle (AV), the first localizer of the first AV comprising a Light Detection and Ranging (LiDAR) localizer; receiving a second location error estimate, corresponding with a second localizer of the first AV, the second localizer of the first AV comprising a Global Navigation Satellite System (GNSS) localizer; associating the first location error estimate and the second location error estimate with first location metadata and first environmental metadata corresponding with the first AV, the first environmental metadata comprising a first measure of scene complexity based on a first amount of motion of moving entities detected by the first AV and an indication of a presence of foliage having diffuse reflection properties that degrade accuracy of the first localizer of the first AV; receiving a third location error estimate, corresponding with a first localizer of a second AV, the first localizer of the second AV comprising a LiDAR localizer; receiving a fourth location error estimate, corresponding with a second localizer of the second AV, the second localizer of the second AV comprising a GNSS localizer; associating the third location error estimate and the fourth location error estimate with second location metadata and second environmental metadata associated with the second AV, the second environmental metadata comprising a second measure of scene complexity based on a second amount of motion of moving entities detected by the second AV and an indication of a presence of foliage having diffuse reflection properties that degrade accuracy of the first localizer of the second AV; determining a location error variance for each of the first localizer and the second localizer of the first AV, based on the first location metadata, the first environmental metadata, the second location 
metadata, and the second environmental metadata; and transmitting the location error variance to the first AV, the location error variance, when received by the first AV, is configured to cause the first AV to update an uncertainty parameter of the first localizer or the second localizer of the first AV during a subsequent localization instance when the first AV determines that a similarity score between third environmental metadata corresponding with the first AV and the first environmental metadata exceeds a predetermined threshold, wherein the updated uncertainty parameter is used by a sensor fusion process of the first AV to determine a relative weighting of inputs from the first localizer or the second localizer when generating an aggregated location estimate for the first AV.
- 12 . The computer-implemented method of claim 11 , wherein the first location metadata indicates a location of the first AV at a time of measurement for the first location error estimate and the second location error estimate.
- 13 . The computer-implemented method of claim 12 , wherein the second location metadata indicates a location of the second AV at a time of measurement for the third location error estimate and the fourth location error estimate.
- 14 . The computer-implemented method of claim 13 , wherein the location of the first AV and the location of the second AV are part of a common location classification based on the first environmental metadata and the second environmental metadata.
- 15 . The computer-implemented method of claim 13 , wherein the location of the first AV and the location of the second AV are different.
- 16 . The computer-implemented method of claim 11 , wherein the first environmental metadata further comprises information indicating environmental conditions around the first AV, and the second environmental metadata further comprises information indicating environmental conditions around the second AV.
- 17 . The computer-implemented method of claim 11 , wherein the first localizer of the first AV comprises a combination of the LiDAR localizer, a GNSS localizer, and an Inertial Measurement Unit (IMU) localizer.
- 18 . The computer-implemented method of claim 11 , wherein the second localizer of the first AV comprises a combination of the GNSS localizer, an Inertial Measurement Unit (IMU) localizer, and a LiDAR localizer.
- 19 . The computer-implemented method of claim 11 , wherein the first localizer of the first AV is different from the second localizer of the first AV.
- 20 . A non-transitory computer-readable storage medium comprising at least one instruction for causing a computer or processor to: receive a first location error estimate, corresponding with a first localizer of a first autonomous vehicle (AV), the first localizer of the first AV comprising a Light Detection and Ranging (LiDAR) localizer; receive a second location error estimate, corresponding with a second localizer of the first AV, the second localizer of the first AV comprising a Global Navigation Satellite System (GNSS) localizer; associate the first location error estimate and the second location error estimate with first location metadata and first environmental metadata corresponding with the first AV, the first environmental metadata comprising a first measure of scene complexity based on a first amount of motion of moving entities detected by the first AV and an indication of a presence of foliage having diffuse reflection properties that degrade accuracy of the first localizer of the first AV; receive a third location error estimate, corresponding with a first localizer of a second AV, the first localizer of the second AV comprising a LiDAR localizer; receive a fourth location error estimate, corresponding with a second localizer of the second AV, the second localizer of the second AV comprising a GNSS localizer; associate the third location error estimate and the fourth location error estimate with second location metadata and second environmental metadata associated with the second AV, the second environmental metadata comprising a second measure of scene complexity based on a second amount of motion of moving entities detected by the second AV and an indication of a presence of foliage having diffuse reflection properties that degrade accuracy of the first localizer of the second AV; determine a location error variance for each of the first localizer and the second localizer of the first AV, based on the first location metadata, the first 
environmental metadata, the second location metadata, and the second environmental metadata; and transmit the location error variance to the first AV, the location error variance, when received by the first AV, is configured to cause the first AV to update an uncertainty parameter of the first localizer or the second localizer of the first AV during a subsequent localization instance when the first AV determines that a similarity score between third environmental metadata corresponding with the first AV and the first environmental metadata exceeds a predetermined threshold, wherein the updated uncertainty parameter is used by a sensor fusion process of the first AV to determine a relative weighting of inputs from the first localizer or the second localizer when generating an aggregated location estimate for the first AV.
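The claims above describe an updated uncertainty parameter that a sensor fusion process uses to set the relative weighting of localizer inputs when generating an aggregated location estimate. One conventional way such a weighting can be realized — offered here only as a hedged sketch, not as the claimed implementation — is inverse-variance weighting, where each localizer's contribution is proportional to the reciprocal of its error variance:

```python
def fuse_location(estimates):
    """estimates: list of (position, variance) pairs from individual localizers.

    Returns the inverse-variance weighted position and the fused variance.
    A localizer with lower variance (higher confidence) receives a
    proportionally larger weight in the aggregated estimate.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_pos = sum(w * pos for (pos, _), w in zip(estimates, weights)) / total
    fused_var = 1.0 / total  # fused variance is tighter than any single input
    return fused_pos, fused_var

# Illustrative 1-D example: a low-variance LiDAR fix dominates a noisier GNSS fix.
pos, var = fuse_location([(10.0, 0.04), (10.6, 0.36)])
```

Under this scheme, increasing the uncertainty parameter of a degraded localizer (e.g., LiDAR near diffusely reflecting foliage) automatically shifts weight toward the remaining localizers.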
Description
BACKGROUND

1. Technical Field

The disclosed technology provides solutions for improving the accuracy of localization estimates and, in particular, provides methods for improving localization accuracy by improving location error estimates associated with individual localizers.

2. Introduction

Autonomous vehicles (AVs) are vehicles having computers and control systems that perform driving and navigation tasks conventionally performed by a human driver. As AV technologies continue to advance, they will be increasingly used to improve transportation efficiency and safety. As such, AVs will need to perform many of the functions that are conventionally performed by human drivers, such as the navigation and routing tasks necessary to provide safe and efficient transportation. Such tasks may require the collection and processing of large quantities of data using various sensor types, including but not limited to cameras and/or Light Detection and Ranging (LiDAR) sensors disposed on the AV. In some instances, the collected data can be used by the AV to perform tasks relating to routing, planning, and obstacle avoidance. Performance of such tasks relies on accurate localization measurements of the AV, as well as of various objects in the AV's environment.

BRIEF DESCRIPTION OF THE DRAWINGS

Certain features of the subject technology are set forth in the appended claims. However, the accompanying drawings, which are included to provide further understanding, illustrate disclosed aspects and together with the description explain the principles of the subject technology. In the drawings:

FIGS. 1A and 1B conceptually illustrate a context in which new location error estimates may be computed for each of a variety of AV localizers, according to some aspects of the disclosed technology.

FIG. 2 illustrates an example system for aggregating localization uncertainty estimates, for example, from a multitude of localizers, according to some aspects of the disclosed technology.

FIG. 3 illustrates a flow diagram of an example process for improving localization error estimates for a multitude of localizer types, according to some aspects of the disclosed technology.

FIG. 4 illustrates an example system environment that can be used to facilitate AV dispatch and operations, according to some aspects of the disclosed technology.

FIG. 5 illustrates an example processor-based system with which some aspects of the subject technology can be implemented.

DETAILED DESCRIPTION

The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology can be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a more thorough understanding of the subject technology. However, it will be clear and apparent that the subject technology is not limited to the specific details set forth herein and may be practiced without these details. In some instances, structures and components are shown in block diagram form to avoid obscuring certain concepts.

As described herein, one aspect of the present technology is the gathering and use of data available from various sources to improve quality and experience. The present disclosure contemplates that in some instances, this gathered data may include personal information. The present disclosure contemplates that the entities involved with such personal information respect and value privacy policies and practices.

Perception systems of autonomous vehicles (AVs) are designed to detect objects in the surrounding environment to execute effective navigation and planning operations. To facilitate navigation and routing decisions, such systems rely on an accurate understanding of AV location and pose with respect to the surrounding environment.
Some AV localization systems utilize location measurements derived using multiple localization sub-systems (e.g., individual localizers) and their supporting algorithms. As used herein, a localizer can refer to any system(s) or device(s) (including hardware and/or software) that can be used to make location estimates. By way of example, an AV localization system may utilize multiple localizers, including but not limited to one or more of: Global Navigation Satellite System (GNSS) localizers, Inertial Measurement Unit (IMU) localizers, camera-based localizers, Light Detection and Ranging (LiDAR) localizers, Vehicle-to-Everything (V2X) chipsets, radar-based localizers, or a combination thereof (e.g., a lane-matching localizer, etc.). To determine vehicle position, the location and corresponding location uncertainty estimates for each localizer can be provided to a sensor fusion process, which then determines or estimates a most likely location and location error for the vehicle. Depending on the desired implementation, a sensor fusion process can