US-12623683-B2 - Methods and systems for measuring sensor visibility
Abstract
Provided are methods and systems for measuring sensor visibility, which can include obtaining sensor data associated with an autonomous vehicle and determining a blockage parameter indicative of a blockage of a sensor based on a comparison of the sensor data with secondary data. Some methods described also include controlling an operation of an autonomous vehicle based on the blockage parameter. Systems and computer program products are also provided.
Inventors
- Timothy O'Donnell
Assignees
- MOTIONAL AD LLC
Dates
- Publication Date
- 20260512
- Application Date
- 20230126
Claims (18)
- 1 . A method, comprising: obtaining, using at least one processor, first sensor data from a first sensor associated with a first autonomous vehicle, wherein the first sensor data is indicative of an environment in which the first autonomous vehicle is operating; obtaining, using the at least one processor, environment data indicative of the environment; determining, using the at least one processor, based on a comparison of the first sensor data and the environment data, a first blockage parameter indicative of a first blockage of the first sensor; obtaining, using the at least one processor, a second blockage parameter from a second autonomous vehicle operating in the environment, wherein the second blockage parameter indicates a second blockage of a second sensor associated with the second autonomous vehicle, wherein the second autonomous vehicle determines the second blockage parameter; determining, using the at least one processor, an adverse weather condition in the environment based on the first blockage parameter and the second blockage parameter; controlling, based on the determined adverse weather condition, an operation of the first autonomous vehicle; and communicating the determined adverse weather condition to a fleet management system, wherein the fleet management system is configured to update driving parameters for a fleet of autonomous vehicles in the environment based on the determined adverse weather condition.
- 2 . The method of claim 1 , wherein determining the first blockage parameter comprises determining, using the at least one processor, whether the first sensor data satisfies a criterion.
- 3 . The method of claim 2 , wherein determining the first blockage parameter comprises, in response to determining that the first sensor data does not satisfy the criterion, determining, using the at least one processor, the first blockage parameter as indicative of the first sensor being blocked.
- 4 . The method of claim 3 , wherein the criterion is based on an object indicated by the environment data, wherein the first sensor data satisfies the criterion when the first sensor data indicates a presence of the object.
- 5 . The method of claim 1 , wherein the environment data is based on three-dimensional map data and data obtained from a third sensor associated with the first autonomous vehicle.
- 6 . The method of claim 5 , wherein the third sensor is a same type of sensor as the first sensor.
- 7 . The method of claim 5 , wherein the first sensor is a non-radar sensor and the third sensor is a radar sensor.
- 8 . The method of claim 5 , further comprising: determining, by the at least one processor, based on the first sensor data and the environment data, an overlapping field-of-vision parameter indicative of a maximum overlapping field-of-vision of the first sensor and the third sensor.
- 9 . The method of claim 8 , wherein determining the first blockage parameter comprises determining, using the at least one processor, whether the first sensor data satisfies a criterion, wherein the criterion is based on the environment data and the overlapping field-of-vision parameter.
- 10 . The method of claim 1 , the method comprising obtaining, using the at least one processor, location data indicative of a location of the first autonomous vehicle.
- 11 . The method of claim 10 , wherein the comparison of the first sensor data and the environment data comprises a comparison of the first sensor data and a localized environment data, wherein the localized environment data is obtained based on the environment data and the location data.
- 12 . The method of claim 1 , wherein the first sensor is selected from a group consisting of a radar sensor, a camera sensor, and a LIDAR sensor.
- 13 . The method of claim 1 , wherein the operation comprises one or more of a speed, an acceleration, and a direction of the first autonomous vehicle.
- 14 . The method of claim 1 , wherein the fleet of autonomous vehicles comprises a plurality of autonomous vehicles, wherein to update the driving parameters the fleet management system is configured to update an operating route to avoid the environment.
- 15 . The method of claim 1 , further comprising decelerating the first autonomous vehicle.
- 16 . A non-transitory computer readable medium comprising instructions stored thereon that, when executed by at least one processor, cause the at least one processor to carry out operations comprising: obtaining first sensor data from a first sensor associated with a first autonomous vehicle, wherein the first sensor data is indicative of an environment in which the first autonomous vehicle is operating; obtaining environment data indicative of the environment; determining based on a comparison of the first sensor data and the environment data, a first blockage parameter indicative of a first blockage of the first sensor; obtaining, using the at least one processor, a second blockage parameter from a second autonomous vehicle operating in the environment, wherein the second blockage parameter indicates a second blockage of a second sensor associated with the second autonomous vehicle, wherein the second autonomous vehicle determines the second blockage parameter; determining an adverse weather condition in the environment based on the first blockage parameter and the second blockage parameter; controlling, based on the determined adverse weather condition, an operation of the first autonomous vehicle; and communicating the determined adverse weather condition to a fleet management system, wherein the fleet management system is configured to update driving parameters for a fleet of autonomous vehicles in the environment based on the determined adverse weather condition.
- 17 . A system, comprising at least one processor; and at least one memory storing instructions thereon that, when executed by the at least one processor, cause the at least one processor to: obtain first sensor data from a first sensor associated with a first autonomous vehicle, wherein the first sensor data is indicative of an environment in which the first autonomous vehicle is operating; obtain environment data indicative of the environment; determine, based on a comparison of the first sensor data and the environment data, a first blockage parameter indicative of a first blockage of the first sensor; obtain a second blockage parameter from a second autonomous vehicle operating in the environment, wherein the second blockage parameter indicates a second blockage of a second sensor associated with the second autonomous vehicle, wherein the second autonomous vehicle determines the second blockage parameter; determine an adverse weather condition in the environment based on the first blockage parameter and the second blockage parameter; control, based on the determined adverse weather condition, an operation of the first autonomous vehicle; and communicate the determined adverse weather condition to a fleet management system, wherein the fleet management system is configured to update driving parameters for a fleet of autonomous vehicles in the environment based on the determined adverse weather condition.
- 18 . The system of claim 17 , wherein the environment data includes three-dimensional map data and data obtained from a third sensor associated with the first autonomous vehicle.
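The flow recited in claims 1-4, 13, and 15 can be sketched in code. The sketch below is purely illustrative: the function names, the missed-object criterion, and the 0.5 threshold are assumptions for exposition and are not specified by the patent.

```python
def blockage_parameter(sensor_detections, expected_objects):
    """Per claims 2-4: the sensor data satisfies the criterion when it
    indicates the presence of an object expected from the environment
    data (e.g., 3-D map data or a third sensor). Here the blockage
    parameter is the fraction of expected objects the sensor missed:
    0.0 = clear view, 1.0 = fully blocked (illustrative assumption)."""
    if not expected_objects:
        return 0.0
    missed = [obj for obj in expected_objects if obj not in sensor_detections]
    return len(missed) / len(expected_objects)

def adverse_weather(first_param, second_param, threshold=0.5):
    """Per claim 1: an adverse weather condition is inferred only when the
    first vehicle's blockage is corroborated by a second vehicle's
    independently determined blockage parameter, which helps distinguish
    weather from a single dirty or damaged sensor. The threshold is a
    hypothetical tuning value."""
    return first_param > threshold and second_param > threshold

# Illustrative usage: the camera reports one of three objects expected
# from localized environment data, and a peer vehicle reports 0.8.
expected = ["stop_sign", "building_a", "overpass"]
own_param = blockage_parameter(["building_a"], expected)  # missed 2 of 3
peer_param = 0.8                                          # from second AV
action = None
notify_fleet = False
if adverse_weather(own_param, peer_param):
    action = "decelerate"   # claim 15: decelerate the first vehicle
    notify_fleet = True     # claim 1: communicate to fleet management
```

In this sketch the fleet-management side (claim 14, rerouting to avoid the environment) is represented only by the `notify_fleet` flag; the corroboration step is the key design point, since a blockage parameter from one vehicle alone cannot separate weather from sensor damage.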
Description
BACKGROUND

Autonomous vehicles (AVs) include various types of sensors, including lidar, radar, cameras, infrared sensors, microphones, and others. However, autonomous vehicles can have limited operational domains due to sensor visibility limitations, such as in inclement weather. Many autonomous vehicle sensors are affected by precipitation and other weather factors. Rain, snow, sleet, fog, dust, mist, hail, smoke, and other obscurants can cause reduced sensor visibility, reduced sensor range, and reduced sensitivity, and can also create false positives. Autonomous vehicle fleet operators weigh the capability of their vehicles against the likelihood of adverse weather conditions. Fleet operators currently have no method for real-time measurement of sensor visibility performance in actual weather conditions; they can rely only on weather reports from traditional weather stations, government agencies, and local media. There is no formal standard for measuring autonomous vehicle sensor visibility. Further, it can be difficult to determine whether degraded sensor performance is due to weather, lens cleanliness, or sensor damage.

BRIEF DESCRIPTION OF THE FIGURES

FIG. 1 is an example environment in which a vehicle including one or more components of an autonomous system can be implemented;

FIG. 2 is a diagram of one or more systems of a vehicle including an autonomous system;

FIG. 3 is a diagram of components of one or more devices and/or one or more systems of FIGS. 1 and 2;

FIG. 4 is a diagram of certain components of an autonomous system;

FIG. 5 is a diagram of an implementation of a process for measuring sensor visibility;

FIGS. 6A-6B are diagrams of an implementation of a process for measuring sensor visibility; and

FIG. 7 is a flowchart of a process for measuring sensor visibility.
DETAILED DESCRIPTION

In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure for the purposes of explanation. It will be apparent, however, that the embodiments described by the present disclosure can be practiced without these specific details. In some instances, well-known structures and devices are illustrated in block diagram form in order to avoid unnecessarily obscuring aspects of the present disclosure.

Specific arrangements or orderings of schematic elements, such as those representing systems, devices, modules, instruction blocks, data elements, and/or the like, are illustrated in the drawings for ease of description. However, it will be understood by those skilled in the art that the specific ordering or arrangement of the schematic elements in the drawings is not meant to imply that a particular order or sequence of processing, or separation of processes, is required unless explicitly described as such. Further, the inclusion of a schematic element in a drawing is not meant to imply that such element is required in all embodiments or that the features represented by such element may not be included in or combined with other elements in some embodiments unless explicitly described as such.

Further, where connecting elements such as solid or dashed lines or arrows are used in the drawings to illustrate a connection, relationship, or association between or among two or more other schematic elements, the absence of any such connecting elements is not meant to imply that no connection, relationship, or association can exist. In other words, some connections, relationships, or associations between elements are not illustrated in the drawings so as not to obscure the disclosure. In addition, for ease of illustration, a single connecting element can be used to represent multiple connections, relationships, or associations between elements.
For example, where a connecting element represents communication of signals, data, or instructions (e.g., “software instructions”), it should be understood by those skilled in the art that such element can represent one or multiple signal paths (e.g., a bus), as may be needed, to effect the communication.

Although the terms first, second, third, and/or the like are used to describe various elements, these elements should not be limited by these terms. The terms first, second, third, and/or the like are used only to distinguish one element from another. For example, a first contact could be termed a second contact and, similarly, a second contact could be termed a first contact without departing from the scope of the described embodiments. The first contact and the second contact are both contacts, but they are not the same contact.

The terminology used in the description of the various described embodiments herein is included for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description of the various described embodiments and the appended claims, the singu