CN-122029450-A - Dynamic occupancy grid fusion with diversified inputs

CN122029450A

Abstract

A method for determining a dynamic occupancy grid includes obtaining radar measurement data from at least one radar sensor of a device, obtaining camera-derived data based on at least one image obtained by at least one camera of the device, and determining the dynamic occupancy grid based on the radar measurement data and the camera-derived data.
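In occupancy-grid terms, each cell of a 2D grid around the device carries a probability of being occupied, and the radar and camera streams each contribute evidence per cell. As a minimal sketch of such fusion (an illustrative Bayesian log-odds model, not the patented fusion rule), two independent per-cell probabilities can be combined relative to a shared prior:

```python
import math

def logit(p):
    """Probability -> log-odds."""
    return math.log(p / (1.0 - p))

def fuse_cell_evidence(p_radar, p_camera, p_prior=0.5):
    """Fuse per-cell occupancy probabilities from two independent sensors.

    Bayesian fusion in log-odds space: each sensor contributes its
    log-odds evidence relative to the shared prior. The function names
    and the uniform 0.5 prior are assumptions for illustration.
    """
    l = logit(p_prior) \
        + (logit(p_radar) - logit(p_prior)) \
        + (logit(p_camera) - logit(p_prior))
    return 1.0 / (1.0 + math.exp(-l))
```

With a 0.5 prior, two agreeing 0.8 readings reinforce each other to roughly 0.94, while a non-committal 0.5 reading from one sensor leaves the other sensor's estimate unchanged.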

Inventors

  • V. Apayadanabalan
  • A.K. Sadik
  • A. Josh
  • J. Popravsky
  • M.P. Johnson Wilson

Assignees

  • Qualcomm Incorporated

Dates

Publication Date
2026-05-12
Application Date
2024-09-06
Priority Date
2024-09-05

Claims (20)

  1. An apparatus comprising: at least one radar sensor; at least one camera; at least one memory; and at least one processor communicatively coupled to the at least one memory, the at least one radar sensor, and the at least one camera, and configured to: obtain radar measurement data from the at least one radar sensor; obtain camera-derived data based on at least one image obtained by the at least one camera; and determine a dynamic occupancy grid based on the radar measurement data and the camera-derived data.
  2. The apparatus of claim 1, wherein the camera-derived data comprises (1) free-space data, (2) optical flow data, (3) occupancy flow data, (4) depth data, (5) ground-hazard data, or a combination of two or more of (1) to (5).
  3. The apparatus of claim 1, further comprising at least one lidar sensor communicatively coupled to the at least one processor, wherein the at least one processor is configured to determine the dynamic occupancy grid based on lidar data obtained from the at least one lidar sensor.
  4. The apparatus of claim 1, wherein the at least one processor is configured to determine the dynamic occupancy grid based on high-definition map data stored in the at least one memory.
  5. The apparatus of claim 1, wherein the at least one processor is configured to: determine a radar measurement grid based on the radar measurement data; determine at least one camera measurement grid based on the camera-derived data; determine a fused measurement grid based on the radar measurement grid and the at least one camera measurement grid; and determine the dynamic occupancy grid based on the fused measurement grid.
  6. The apparatus of claim 5, wherein the at least one processor is configured to determine a prediction grid, and to determine the fused measurement grid further based on the prediction grid.
  7. The apparatus of claim 6, wherein the at least one processor is configured to determine the dynamic occupancy grid further based on a positioning input comprising positioning information of the apparatus.
  8. A method for determining a dynamic occupancy grid, the method comprising: obtaining radar measurement data from at least one radar sensor of a device; obtaining camera-derived data based on at least one image obtained by at least one camera of the device; and determining the dynamic occupancy grid based on the radar measurement data and the camera-derived data.
  9. The method of claim 8, wherein the camera-derived data comprises (1) free-space data, (2) optical flow data, (3) occupancy flow data, (4) depth data, (5) ground-hazard data, or a combination of two or more of (1) to (5).
  10. The method of claim 8, further comprising obtaining lidar data via at least one lidar sensor of the device, wherein determining the dynamic occupancy grid is further based on the lidar data.
  11. The method of claim 8, wherein determining the dynamic occupancy grid is further based on high-definition map data.
  12. The method of claim 8, further comprising: determining a radar measurement grid based on the radar measurement data; determining at least one camera measurement grid based on the camera-derived data; and determining a fused measurement grid based on the radar measurement grid and the at least one camera measurement grid; wherein determining the dynamic occupancy grid is based on the fused measurement grid.
  13. The method of claim 12, further comprising determining a prediction grid, wherein determining the fused measurement grid is further based on the prediction grid.
  14. The method of claim 13, wherein determining the dynamic occupancy grid is further based on a positioning input comprising positioning information of the device.
  15. An apparatus comprising: means for obtaining radar measurement data from at least one radar sensor; means for obtaining camera-derived data based on at least one image obtained by at least one camera of the apparatus; and means for determining a dynamic occupancy grid based on the radar measurement data and the camera-derived data.
  16. The apparatus of claim 15, wherein the camera-derived data comprises (1) free-space data, (2) optical flow data, (3) occupancy flow data, (4) depth data, (5) ground-hazard data, or a combination of two or more of (1) to (5).
  17. The apparatus of claim 15, further comprising means for obtaining lidar data, wherein the means for determining the dynamic occupancy grid comprises means for determining the dynamic occupancy grid based on the lidar data.
  18. The apparatus of claim 15, wherein the means for determining the dynamic occupancy grid comprises means for determining the dynamic occupancy grid based on high-definition map data.
  19. The apparatus of claim 15, further comprising: means for determining a radar measurement grid based on the radar measurement data; means for determining at least one camera measurement grid based on the camera-derived data; and means for determining a fused measurement grid based on the radar measurement grid and the at least one camera measurement grid; wherein the means for determining the dynamic occupancy grid comprises means for determining the dynamic occupancy grid based on the fused measurement grid.
  20. The apparatus of claim 19, further comprising means for determining a prediction grid, wherein the means for determining the fused measurement grid comprises means for determining the fused measurement grid based on the prediction grid.
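Claims 5-7 (and their method counterparts 12-14) describe a pipeline: per-sensor measurement grids are built, fused into one measurement grid, and combined with a prediction grid carried over from the previous time step. The following is a hedged sketch of that data flow only; the grid resolution, the 0.9 evidence value, and the max/blend fusion rules are illustrative assumptions, not the claimed fusion method:

```python
import numpy as np

def radar_measurement_grid(detections, shape, res=0.5):
    """Rasterize radar returns (x, y in metres) into an occupancy-evidence grid.

    Each return marks its cell with 0.9 occupied evidence (assumed value).
    """
    grid = np.zeros(shape)
    for x, y in detections:
        i, j = int(y / res), int(x / res)
        if 0 <= i < shape[0] and 0 <= j < shape[1]:
            grid[i, j] = 0.9
    return grid

def fused_measurement_grid(radar_grid, camera_grid):
    """Combine per-sensor measurement grids cell-by-cell (max rule, for illustration)."""
    return np.maximum(radar_grid, camera_grid)

def dynamic_occupancy_grid(fused_grid, prediction_grid, w_pred=0.5):
    """Blend the fused measurement grid with the prediction grid from the previous step."""
    return w_pred * prediction_grid + (1.0 - w_pred) * fused_grid
```

A production dynamic occupancy grid would also propagate per-cell velocity estimates in the prediction step (e.g., with a particle filter), which this sketch omits.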

Description

Dynamic occupancy grid fusion with diversified inputs

Cross Reference to Related Applications

The present application claims the benefit of U.S. patent application Ser. No. 18/825,534, entitled "DYNAMIC OCCUPANCY GRID FUSION WITH DIVERSE INPUTS", filed on day 9 of 2024, which claims the benefit of U.S. provisional application Ser. No. 63/590,944, entitled "DYNAMIC OCCUPANCY GRID FUSION WITH DIVERSE INPUTS", filed on day 17 of 2023, both of which are assigned to the assignee of the present application and are hereby incorporated by reference in their entirety for all purposes.

Background

Vehicles are becoming more intelligent as the industry deploys increasingly sophisticated self-driving technologies that can operate a vehicle with little or no human input and are therefore semi-autonomous or autonomous. Autonomous and semi-autonomous vehicles may be able to detect information about their location and surrounding environment (e.g., using ultrasound, radar, lidar, an SPS (satellite positioning system), odometry, and/or one or more sensors such as accelerometers, cameras, etc.). Autonomous and semi-autonomous vehicles typically include a control system to interpret information about the environment in which the vehicle is located, in order to identify hazards and determine a navigation path to follow. A driver assistance system may mitigate driving risk for the driver of an ego vehicle (i.e., a vehicle configured to perceive its own environment) and/or for other road users. The driver assistance system may include one or more active devices and/or one or more passive devices that may be used to determine the environment of the ego vehicle and, for semi-autonomous vehicles, may inform the driver of situations that the driver may be able to address.
The driver assistance system may be configured to control various aspects of driving safety and/or driver monitoring. For example, the driver assistance system may control the speed of the ego vehicle to maintain at least a desired separation (in distance or time) between the ego vehicle and another vehicle (e.g., as part of an active cruise control system). The driver assistance system may monitor the surroundings of the ego vehicle, for example, to maintain situational awareness of the ego vehicle. This situational awareness may be used to inform the driver of a problem, e.g., that another vehicle is in the driver's blind spot, or that another vehicle is on a collision path with the ego vehicle. The situational awareness may include information about the ego vehicle (e.g., speed, location, heading) and/or information about other vehicles or objects (e.g., location, speed, heading, size, object type, etc.). The status of the ego vehicle may be used as an input to a variety of driver assistance functionalities, such as Advanced Driver Assistance Systems (ADAS). Downstream driver assistance, such as ADAS, may be safety critical and/or may provide information to the driver and/or control the vehicle in some way.

Disclosure of Invention

An example apparatus includes at least one radar sensor, at least one camera, at least one memory, and at least one processor communicatively coupled to the at least one memory, the at least one radar sensor, and the at least one camera, and configured to obtain radar measurement data from the at least one radar sensor, obtain camera-derived data based on at least one image obtained by the at least one camera, and determine a dynamic occupancy grid based on the radar measurement data and the camera-derived data.
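The desired-separation behaviour mentioned above can be expressed as a time-gap rule: the desired following distance scales with the ego vehicle's speed. A toy proportional controller along those lines is sketched below; the function name, the 2-second gap, and the gain are purely illustrative assumptions, not part of the patent:

```python
def acc_speed_command(ego_speed_mps, separation_m, desired_gap_s=2.0, gain=0.2):
    """Proportional speed command to hold a desired time-gap separation.

    The desired separation is desired_gap_s * ego speed; if the actual
    separation is smaller, the commanded speed drops (clamped at zero).
    """
    desired_sep = desired_gap_s * ego_speed_mps
    return max(0.0, ego_speed_mps + gain * (separation_m - desired_sep))
```

At 25 m/s with a 2-second gap the desired separation is 50 m, so a 50 m gap holds the current speed, while a 40 m gap yields a reduced command.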
An example method for determining a dynamic occupancy grid includes obtaining radar measurement data from at least one radar sensor of an apparatus, obtaining camera-derived data based on at least one image obtained by at least one camera of the apparatus, and determining the dynamic occupancy grid based on the radar measurement data and the camera-derived data. Another example apparatus includes means for obtaining radar measurement data from at least one radar sensor, means for obtaining camera-derived data based on at least one image obtained by at least one camera of the apparatus, and means for determining a dynamic occupancy grid based on the radar measurement data and the camera-derived data. An example non-transitory processor-readable storage medium includes processor-readable instructions for causing at least one processor of an apparatus to obtain radar measurement data from at least one radar sensor of the apparatus, obtain camera-derived data based on at least one image obtained by at least one camera of the apparatus, and determine a dynamic occupancy grid based on the radar measurement data and the camera-derived data.

Drawings

FIG. 1 is a top view of an example ego vehicle. FIG. 2 is a block diagram of components of an example device that the ego vehicle s