US-12617393-B2 - System for detecting moving objects

US12617393B2

Abstract

A computer that includes a processor and a memory, the memory including instructions executable by the processor to determine a location and a trajectory of a moving object with respect to a stationary vehicle based on one or more first stationary vehicle sensors. A threat level can be determined based on the location and the trajectory of the moving object, and a control barrier function. Based on the threat level, the stationary vehicle can be operated to move the stationary vehicle to reduce a probability of impact between the stationary vehicle and the moving object.

Inventors

  • Smruti Ranjan Panigrahi
  • Erol Dogan Sumer
  • Ehsan Arabi

Assignees

  • FORD GLOBAL TECHNOLOGIES, LLC

Dates

Publication Date
2026-05-05
Application Date
2023-12-04

Claims (20)

  1. A system, comprising: a computer that includes a processor and a memory, the memory including instructions executable by the processor to: determine a location and a trajectory of a moving object with respect to a stationary vehicle based on one or more first stationary vehicle sensors; determine a threat level based on the location and the trajectory of the moving object, and a control barrier function based on curvature of a control barrier line, a lateral distance of the moving object to the stationary vehicle perpendicular to a direction of travel of the moving object, and a longitudinal distance to the stationary vehicle parallel to the direction of travel of the moving object; and based on the threat level, operate the stationary vehicle to move the stationary vehicle to reduce a probability of impact between the stationary vehicle and the moving object.
  2. The system of claim 1, the instructions including further instructions to determine the probability of impact based on the control barrier function, the location of the moving object and the trajectory of the moving object.
  3. The system of claim 1, wherein the threat level includes an idle state, an aware state, a warning state, and an evade state, wherein the idle state includes no probability of impact, the aware state includes a low probability of impact, the warning state includes a moderate probability of impact and the evade state includes a high probability of impact.
  4. The system of claim 1, wherein the stationary vehicle is one or more of parked, in a key-off state, and unoccupied.
  5. The system of claim 1, the instructions including further instructions to operate the stationary vehicle by controlling one or more of vehicle propulsion, vehicle steering, and vehicle brakes.
  6. The system of claim 1, the instructions including further instructions to acquire first stationary vehicle sensor data periodically and combine the first stationary vehicle sensor data into a virtual sensor grid.
  7. The system of claim 1, the instructions including further instructions to determine the trajectory of the moving object using a Kalman filter based on determining two or more locations of the moving object.
  8. The system of claim 1, the instructions including further instructions to activate second stationary vehicle sensors based on the threat level.
  9. The system of claim 1, the instructions including further instructions to predict a location of impact on the stationary vehicle based on the control barrier function.
  10. The system of claim 1, the instructions including further instructions to activate one or more lights included in the stationary vehicle based on the threat level.
  11. The system of claim 1, the instructions including further instructions to perform one or more of flashing a vehicle light and sounding a vehicle horn based on the threat level.
  12. The system of claim 1, the instructions including further instructions to determine an evasive distance around the stationary vehicle within which to operate the stationary vehicle to reduce the probability of impact without increasing the probability of impacting an object in an environment around the stationary vehicle.
  13. The system of claim 1, wherein the moving object is a second vehicle.
  14. A method comprising: determining a location and a trajectory of a moving object with respect to a stationary vehicle based on one or more first stationary vehicle sensors; determining a threat level based on the location and the trajectory of the moving object, and a control barrier function based on curvature of a control barrier line, a lateral distance of the moving object to the stationary vehicle perpendicular to a direction of travel of the moving object, and a longitudinal distance to the stationary vehicle parallel to the direction of travel of the moving object; and based on the threat level, operating the stationary vehicle to move the stationary vehicle to reduce the probability of impact between the stationary vehicle and the moving object.
  15. The method of claim 14, further comprising determining the probability of impact based on the control barrier function, the location of the moving object and the trajectory of the moving object.
  16. The method of claim 14, wherein the threat level includes an idle state, an aware state, a warning state, and an evade state, wherein the idle state includes no probability of impact, the aware state includes a low probability of impact, the warning state includes a moderate probability of impact and the evade state includes a high probability of impact.
  17. The method of claim 14, wherein the stationary vehicle is one or more of parked, in a key-off state, and unoccupied.
  18. The method of claim 14, further comprising operating the stationary vehicle by controlling one or more of vehicle propulsion, vehicle steering, and vehicle brakes.
  19. The method of claim 14, further comprising acquiring first stationary vehicle sensor data periodically and combining the first stationary vehicle sensor data into a virtual sensor grid.
  20. The method of claim 14, further comprising determining the trajectory of the moving object using a Kalman filter based on determining two or more locations of the moving object.
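Claims 7 and 20 recite determining the moving object's trajectory with a Kalman filter based on two or more observed locations. The following is a minimal illustrative sketch of a constant-velocity Kalman filter over one axis (which could be run independently per axis); the class name, noise parameters, and initial covariance are assumptions for illustration, not details from the patent:

```python
class ConstantVelocityKF:
    """1-D constant-velocity Kalman filter with state (position, velocity).

    Estimates a moving object's position and velocity from a sequence of
    noisy position measurements; one filter instance per axis.
    """

    def __init__(self, z0, r=0.25, q=0.01):
        self.p, self.v = z0, 0.0              # state estimate
        self.P = [[10.0, 0.0], [0.0, 10.0]]   # state covariance (uncertain start)
        self.r, self.q = r, q                 # measurement / process noise

    def predict(self, dt):
        # State transition F = [[1, dt], [0, 1]]: position advances by v*dt.
        self.p += self.v * dt
        P00, P01, P10, P11 = self.P[0][0], self.P[0][1], self.P[1][0], self.P[1][1]
        # Covariance propagation: P' = F P F^T + Q
        self.P = [
            [P00 + dt * (P01 + P10) + dt * dt * P11 + self.q, P01 + dt * P11],
            [P10 + dt * P11, P11 + self.q],
        ]

    def update(self, z):
        # Measurement model H = [1, 0]: only position is observed.
        y = z - self.p                         # innovation
        S = self.P[0][0] + self.r              # innovation covariance
        K0, K1 = self.P[0][0] / S, self.P[1][0] / S  # Kalman gain
        self.p += K0 * y
        self.v += K1 * y
        P00, P01, P10, P11 = self.P[0][0], self.P[0][1], self.P[1][0], self.P[1][1]
        # Covariance update: P = (I - K H) P
        self.P = [
            [(1 - K0) * P00, (1 - K0) * P01],
            [P10 - K1 * P00, P11 - K1 * P01],
        ]
```

For example, fed positions of an object moving at 1 m/s sampled once per second, the velocity estimate converges toward 1 m/s after a few updates:

```python
kf = ConstantVelocityKF(0.0)
for z in [1.0, 2.0, 3.0, 4.0]:
    kf.predict(1.0)
    kf.update(z)
# kf.v now approximates the object's 1 m/s velocity
```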

Description

BACKGROUND

Computers can operate systems and/or devices including vehicles, robots, drones, and/or object tracking systems. Data including images can be acquired by sensors and processed by a computer to determine a location of a system with respect to objects in an environment around the system. A computer may use the location data to determine one or more trajectories of objects and/or the system or components thereof in the environment.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of an example vehicle system.
FIG. 2 is a diagram of an example vehicle including sensors.
FIG. 3 is a diagram of an example vehicle virtual sensing grid.
FIG. 4 is a diagram of an example vehicle virtual sensing grid including objects.
FIG. 5 is a diagram of an example traffic scene.
FIG. 6 is a diagram of another example traffic scene.
FIG. 7 is a diagram of a further example traffic scene.
FIG. 8 is a diagram of an example vehicle impact detection system.
FIG. 9 is a flowchart diagram of example vehicle impact detection.

DETAILED DESCRIPTION

Systems including vehicles, robots, drones, etc., can be operated by acquiring sensor data, including data regarding an environment around the system, and processing the sensor data to determine locations of objects in the environment around the system. The determined location data could be processed to determine operation of the system or portions of the system. For example, a robot could determine the location of another nearby robot's arm. The determined robot arm location could be used by the robot to determine a path upon which to move a gripper to grasp a workpiece while decreasing the probability of encountering the other robot's arm. In another example, a vehicle could determine a location of another vehicle traveling on a roadway. The vehicle could use the determined location of the other vehicle to determine a path upon which to operate while planning to maintain a predetermined distance from the other vehicle.
Vehicle operation will be used as a non-limiting example of system location determination in the description below. In examples herein, a vehicle can be operated using light impact detection for pre-impact and post-impact vehicle control. Light impact detection herein means detection of a moving object in an environment around a vehicle, including a location and velocity of the moving object, a determination that the moving object will likely impact the vehicle, and a determination that the impact will likely occur at a relatively low speed. Light impact detection in these examples assumes that the moving object is within 1 to 3 meters (m) of the vehicle and is moving at a low speed, e.g., a maximum of 1 to 3 meters/second (m/s). Techniques described herein typically enhance light impact detection by performing it while a vehicle is parked and/or unoccupied. Light impact detection described herein detects moving objects in a virtual sensing grid around a parked and/or unoccupied vehicle. The vehicle can then be operated to take evasive action to reduce a probability of impact. In examples where the vehicle cannot be operated to reduce the probability of impact, vehicle sensors can be operated to record data before, during, and after the impact, including images of the moving object. The recorded data can provide a record of the impact, e.g., it can be uploaded to a server computer or the like to inform users that an impact has occurred. A virtual sensing grid herein means a radial pattern extending out from the vehicle upon which location and velocity data for moving objects can be placed. Virtual sensing refers to the light impact sensing system combining location and velocity data from two or more sensor types or modalities into a single data point on the virtual sensing grid. For example, a vehicle can include one or more optical sensors such as video cameras, one or more ultrasonic sensors, and/or one or more short-range radar sensors.
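The virtual sensing grid described above, a radial pattern around the vehicle onto which detections from multiple sensor modalities are fused into single data points, can be sketched as follows. The cell sizes, the simple averaging fusion rule, and all names are illustrative assumptions, not details from the patent:

```python
import math
from collections import defaultdict

def grid_cell(x, y, range_res=0.5, n_sectors=36):
    """Map an (x, y) detection in meters (vehicle at origin) to a radial
    grid cell identified by (ring index, angular sector index)."""
    rng = math.hypot(x, y)
    bearing = math.atan2(y, x) % (2 * math.pi)
    ring = int(rng / range_res)
    sector = int(bearing / (2 * math.pi / n_sectors))
    return ring, sector

def fuse_detections(detections):
    """Combine per-sensor detections (x, y, speed) that fall in the same
    radial cell into a single averaged grid point per cell."""
    cells = defaultdict(list)
    for x, y, speed in detections:
        cells[grid_cell(x, y)].append((x, y, speed))
    fused = {}
    for cell, pts in cells.items():
        n = len(pts)
        fused[cell] = tuple(sum(p[i] for p in pts) / n for i in range(3))
    return fused
```

For example, a camera detection at (2.0, 1.0) and a radar detection of the same object at (2.1, 1.05) fall in the same radial cell and fuse into one averaged grid point, illustrating how two sensor modalities contribute a single data point to the grid.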
Respective ones of these sensors, following acquisition and processing by a computer included in the vehicle, can produce location and velocity data regarding moving objects. Combining data from two or more sensors can enhance the accuracy, resolution, and reliability of acquired moving object data. A computer 115 as described herein can determine a threat level by comparing a moving object's location and trajectory to a control barrier function, which determines distances around a stationary vehicle. Threat levels in the context of this description correspond to respective probabilities or ranges of probabilities of an impact, and in one example can include “idle,” which includes no probability of impact; “aware,” which includes a low probability of impact (e.g., less than 10% probability); “warning,” which includes a moderate probability of impact (e.g., between 10% and 90%); and “evade,” which includes a high probability of impact (e.g., greater than 90%) between the stationary vehicle and a moving object. Determining threat levels
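The example threat-level thresholds above (no, low, moderate, and high probability of impact) can be read as a simple classifier. This is an illustrative sketch of that mapping; the function name and the handling of the exact boundary values are assumptions, not the patent's implementation:

```python
def threat_level(impact_probability: float) -> str:
    """Map an impact probability to one of the four example threat states.

    Thresholds follow the example ranges in the description: "aware" is a
    low probability (less than 10%), "warning" is moderate (10% to 90%),
    and "evade" is high (greater than 90%); "idle" means no probability
    of impact.
    """
    if impact_probability <= 0.0:
        return "idle"
    if impact_probability < 0.10:
        return "aware"
    if impact_probability <= 0.90:
        return "warning"
    return "evade"
```

A stationary vehicle could, for instance, remain passive in the "idle" and "aware" states, flash lights or sound the horn in the "warning" state, and take evasive action in the "evade" state, consistent with claims 10 through 12.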