US-20260129148-A1 - Real-Time Adjustment Of Vehicle Sensor Field Of View Volume
Abstract
Disclosed are systems and methods that can be used for adjusting the field of view of one or more sensors of an autonomous vehicle. In the systems and methods, each sensor of the one or more sensors is configured to operate in accordance with a field of view volume up to a maximum field of view volume. The systems and methods include determining an operating environment of an autonomous vehicle. The systems and methods also include based on the determined operating environment of the autonomous vehicle, adjusting a field of view volume of at least one sensor of the one or more sensors from a first field of view volume to an adjusted field of view volume different from the first field of view volume. Additionally, the systems and methods include controlling the autonomous vehicle to operate using the at least one sensor having the adjusted field of view volume.
Inventors
- Simon Verghese
- Alexander McCauley
Assignees
- WAYMO LLC
Dates
- Publication Date
- 20260507
- Application Date
- 20250919
Claims (20)
- 1 . A method comprising: operating a light detection and ranging (LIDAR) system using one or more parameters associated with a clear weather condition; detecting a foggy weather condition, wherein detecting the foggy weather condition comprises detecting, using the LIDAR system, backscattered light from fog droplets; and in response to detecting the foggy weather condition, operating the LIDAR system using one or more parameters associated with the foggy weather condition, wherein the one or more parameters associated with the foggy weather condition differ from the one or more parameters associated with the clear weather condition.
- 2 . The method of claim 1 , wherein the one or more parameters associated with the clear weather condition include a first power level of laser pulses transmitted by the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include a second power level of laser pulses transmitted by the LIDAR system, and wherein the second power level is greater than the first power level.
- 3 . The method of claim 1 , wherein the one or more parameters associated with the clear weather condition include parameters that define a first field of view volume for the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include parameters that define a second field of view volume for the LIDAR system, and wherein the second field of view volume is smaller than the first field of view volume.
- 4 . The method of claim 3 , wherein the parameters that define the first field of view volume for the LIDAR system define a first range, a first azimuth, and a first elevation for the first field of view, and wherein the parameters that define the second field of view volume for the LIDAR system define a second range, a second azimuth, and a second elevation for the second field of view.
- 5 . The method of claim 1 , wherein the one or more parameters associated with the clear weather condition include a first field of view range for the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include a second field of view range for the LIDAR system, and wherein the second field of view range is less than the first field of view range.
- 6 . The method of claim 1 , wherein the one or more parameters associated with the clear weather condition relate to processing sensor data acquired by the LIDAR system, and wherein the one or more parameters associated with the foggy weather condition relate to processing sensor data acquired by the LIDAR system.
- 7 . The method of claim 6 , wherein the one or more parameters associated with the foggy weather condition define a reduced range for the LIDAR system and cause sensor data acquired by the LIDAR system corresponding to distances that exceed the reduced range to be ignored.
- 8 . The method of claim 1 , wherein the LIDAR system is mounted on a vehicle.
- 9 . A vehicle comprising: a light detection and ranging (LIDAR) system; one or more processors; and a memory coupled to the one or more processors and having stored thereon instructions that, upon execution by the one or more processors, cause the one or more processors to perform operations comprising: operating the LIDAR system using one or more parameters associated with a clear weather condition; detecting a foggy weather condition, wherein detecting the foggy weather condition comprises detecting, using the LIDAR system, backscattered light from fog droplets; and in response to detecting the foggy weather condition, operating the LIDAR system using one or more parameters associated with the foggy weather condition, wherein the one or more parameters associated with the foggy weather condition differ from the one or more parameters associated with the clear weather condition.
- 10 . The vehicle of claim 9 , wherein the one or more parameters associated with the clear weather condition include a first power level of laser pulses transmitted by the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include a second power level of laser pulses transmitted by the LIDAR system, and wherein the second power level is greater than the first power level.
- 11 . The vehicle of claim 9 , wherein the one or more parameters associated with the clear weather condition include parameters that define a first field of view volume for the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include parameters that define a second field of view volume for the LIDAR system, and wherein the second field of view volume is smaller than the first field of view volume.
- 12 . The vehicle of claim 11 , wherein the parameters that define the first field of view volume for the LIDAR system define a first range, a first azimuth, and a first elevation for the first field of view, and wherein the parameters that define the second field of view volume for the LIDAR system define a second range, a second azimuth, and a second elevation for the second field of view.
- 13 . The vehicle of claim 9 , wherein the one or more parameters associated with the clear weather condition include a first field of view range for the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include a second field of view range for the LIDAR system, and wherein the second field of view range is less than the first field of view range.
- 14 . The vehicle of claim 9 , wherein the one or more parameters associated with the clear weather condition relate to processing sensor data acquired by the LIDAR system, and wherein the one or more parameters associated with the foggy weather condition relate to processing sensor data acquired by the LIDAR system.
- 15 . The vehicle of claim 14 , wherein the one or more parameters associated with the foggy weather condition define a reduced range for the LIDAR system and cause sensor data acquired by the LIDAR system corresponding to distances that exceed the reduced range to be ignored.
- 16 . A non-transitory computer-readable storage medium, having stored thereon program instructions that, upon execution by one or more processors, cause the one or more processors to perform operations, the operations comprising: operating a light detection and ranging (LIDAR) system using one or more parameters associated with a clear weather condition; detecting a foggy weather condition, wherein detecting the foggy weather condition comprises detecting, using the LIDAR system, backscattered light from fog droplets; and in response to detecting the foggy weather condition, operating the LIDAR system using one or more parameters associated with the foggy weather condition, wherein the one or more parameters associated with the foggy weather condition differ from the one or more parameters associated with the clear weather condition.
- 17 . The non-transitory computer-readable storage medium of claim 16 , wherein the one or more parameters associated with the clear weather condition include a first power level of laser pulses transmitted by the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include a second power level of laser pulses transmitted by the LIDAR system, and wherein the second power level is greater than the first power level.
- 18 . The non-transitory computer-readable storage medium of claim 16 , wherein the one or more parameters associated with the clear weather condition include parameters that define a first field of view volume for the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include parameters that define a second field of view volume for the LIDAR system, and wherein the second field of view volume is smaller than the first field of view volume.
- 19 . The non-transitory computer-readable storage medium of claim 16 , wherein the one or more parameters associated with the clear weather condition include a first field of view range for the LIDAR system, wherein the one or more parameters associated with the foggy weather condition include a second field of view range for the LIDAR system, and wherein the second field of view range is less than the first field of view range.
- 20 . The non-transitory computer-readable storage medium of claim 16 , wherein the one or more parameters associated with the clear weather condition relate to processing sensor data acquired by the LIDAR system, and wherein the one or more parameters associated with the foggy weather condition relate to processing sensor data acquired by the LIDAR system.
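The behavior recited in claims 1, 2, 3, and 7 (switch from clear-weather to foggy-weather parameters when near-range backscatter is detected, raising pulse power while shrinking the field-of-view range, and ignoring returns beyond the reduced range) can be sketched as follows. This is an illustrative sketch only: all names, thresholds, and numeric values are hypothetical assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class LidarParams:
    """Hypothetical LIDAR parameter set; field names are illustrative."""
    pulse_power_w: float   # transmit power of laser pulses (claim 2)
    max_range_m: float     # field-of-view range limit (claims 3, 5)
    azimuth_deg: float     # horizontal extent of the field of view (claim 4)
    elevation_deg: float   # vertical extent of the field of view (claim 4)

CLEAR = LidarParams(pulse_power_w=1.0, max_range_m=200.0,
                    azimuth_deg=360.0, elevation_deg=30.0)
# Per claims 2 and 3: higher pulse power, smaller field-of-view volume.
FOGGY = LidarParams(pulse_power_w=2.0, max_range_m=60.0,
                    azimuth_deg=360.0, elevation_deg=30.0)

def detect_fog(return_ranges_m, near_m=10.0, threshold=0.3):
    """Classify the scene as foggy when a large fraction of returns are
    near-range backscatter (e.g., from fog droplets), per claim 1."""
    if not return_ranges_m:
        return False
    near = sum(1 for r in return_ranges_m if r < near_m)
    return near / len(return_ranges_m) >= threshold

def select_params(return_ranges_m):
    """Switch parameter sets in response to the detected condition."""
    return FOGGY if detect_fog(return_ranges_m) else CLEAR

def filter_returns(return_ranges_m, params):
    """Per claim 7: ignore sensor data beyond the reduced range."""
    return [r for r in return_ranges_m if r <= params.max_range_m]
```

For example, a scan whose returns cluster within a few meters of the sensor would select the foggy-weather parameter set, and any residual long-range returns would then be discarded as unreliable.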
Description
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. application Ser. No. 17/002,092, filed on Aug. 25, 2020, which claims priority to U.S. Provisional Application No. 62/952,879, filed on Dec. 23, 2019. The foregoing applications are incorporated herein by reference.
BACKGROUND
Vehicles can be configured to operate in an autonomous mode in which the vehicle navigates through an environment with little or no input from a driver. Such autonomous vehicles can include one or more systems (e.g., sensors and associated computing devices) that are configured to detect information about the environment in which the vehicle operates. The vehicle and its associated computer-implemented controller use the detected information to navigate through the environment. For example, if the system(s) detect that the vehicle is approaching an obstacle, as determined by the computer-implemented controller, the controller adjusts the vehicle's directional controls to cause the vehicle to navigate around the obstacle. An autonomous vehicle may include lasers, sonar, radar, cameras, thermal imagers, and other sensors that scan and/or record data about the surroundings of the vehicle. Sensor data from one or more of these devices may be used to detect objects and their respective characteristics (position, shape, heading, speed, etc.). This detection and identification is useful for the operation of autonomous vehicles.
SUMMARY
In one example, the present disclosure provides a system. The system includes one or more sensors, each sensor of the one or more sensors being configured to operate in accordance with a field of view volume, the field of view volume representing a space surrounding an autonomous vehicle within which the sensor is expected to detect objects at a confidence level higher than a predefined confidence threshold. The system also includes one or more processors coupled to the one or more sensors.
The system also includes a memory coupled to the one or more processors and having stored thereon instructions that, upon execution by the one or more processors, cause the one or more processors to perform operations. The operations include identifying a plurality of operational design domains (ODDs) for the autonomous vehicle, where each ODD includes at least one of an environmental condition, a geographical condition, a time-of-day condition, a traffic condition, or a roadway condition, and where each ODD is associated with a predetermined field of view volume for at least one of the one or more sensors. The operations also include associating the autonomous vehicle with a first ODD of the plurality of ODDs. The operations also include detecting a change in an operating environment of the autonomous vehicle. The operations also include in response to the detecting, associating the autonomous vehicle with a second ODD of the plurality of ODDs. The operations also include in response to the autonomous vehicle being associated with the second ODD, operating the at least one sensor using the predetermined field of view volume associated with the second ODD. Some examples of the present disclosure provide a method performed by a computing device configured to control operation of an autonomous vehicle. 
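The ODD-driven operations described above (identify a set of ODDs, each associated with a predetermined field of view volume; re-associate the vehicle with a new ODD when the operating environment changes; then operate the sensor using that ODD's volume) can be sketched as follows. All ODD names and field-of-view values here are hypothetical assumptions for illustration, not values from the disclosure.

```python
# Each ODD maps to a predetermined field of view volume, expressed here
# as (range_m, azimuth_deg, elevation_deg). Entries are hypothetical.
ODD_FOV_VOLUMES = {
    "clear_highway": (250.0, 360.0, 20.0),
    "dense_fog":     (60.0, 360.0, 20.0),
    "urban_night":   (120.0, 360.0, 30.0),
}

class SensorFovManager:
    """Tracks the vehicle's current ODD association and applies the
    predetermined field of view volume for that ODD."""

    def __init__(self, initial_odd):
        if initial_odd not in ODD_FOV_VOLUMES:
            raise ValueError(f"unknown ODD: {initial_odd}")
        self.odd = initial_odd
        self.fov_volume = ODD_FOV_VOLUMES[initial_odd]

    def on_environment_change(self, new_odd):
        """Associate the vehicle with a new ODD in response to a detected
        change in the operating environment, and return the field of view
        volume the sensor should now operate with."""
        if new_odd not in ODD_FOV_VOLUMES:
            raise ValueError(f"unknown ODD: {new_odd}")
        self.odd = new_odd
        self.fov_volume = ODD_FOV_VOLUMES[new_odd]
        return self.fov_volume
```

A lookup table keyed by ODD keeps the per-condition volumes "predetermined," as the operations require, rather than computed on the fly.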
The method includes identifying a plurality of operational design domains (ODDs) for the autonomous vehicle, where each ODD includes at least one of an environmental condition, a geographical condition, a time-of-day condition, a traffic condition, or a roadway condition, and where each ODD is associated with a predetermined field of view volume for at least one of one or more sensors, where each sensor of the one or more sensors is configured to operate in accordance with a field of view volume, the field of view volume representing a space surrounding the autonomous vehicle within which the sensor is expected to detect objects at a confidence level higher than a predefined confidence threshold. The method also includes associating the autonomous vehicle with a first ODD of the plurality of ODDs. The method also includes detecting a change in an operating environment of the autonomous vehicle. The method also includes in response to the detecting, associating the autonomous vehicle with a second ODD of the plurality of ODDs. The method also includes in response to the autonomous vehicle being associated with the second ODD, operating the at least one sensor using the predetermined field of view volume associated with the second ODD. Some examples of the present disclosure provide a non-transitory computer-readable storage medium, having stored thereon program instructions that, upon execution by one or more processors, cause the one or more processors to perform operations. The operations include identifying a plurality of operational design domains (ODDs) for the autonomous vehicle, where each ODD includes at least one of an environmental condition, a geographical condition, a