
US-12617412-B2 - System and methods for detecting abnormal following vehicles

US 12617412 B2

Abstract

Embodiments of the present disclosure provide a privacy-preserving defensive driving system that can detect abnormal following vehicles during driving. An example system may be configured to: continuously capture video data of the camera's field-of-view, detect following vehicles in the captured video data, and determine whether one or more following vehicles is exhibiting abnormal following behavior with respect to a first vehicle.

Inventors

  • Wei Sun
  • Kannan Srinivasan

Assignees

  • OHIO STATE INNOVATION FOUNDATION

Dates

Publication Date
2026-05-05
Application Date
2024-02-21

Claims (12)

  1. A system for detecting abnormal following vehicles, the system comprising: at least one sensor configured to detect motion data of a first vehicle along the x-axis, y-axis, and z-axis; a camera positioned within the first vehicle such that a field-of-view (FOV) of the camera faces outward from a rear of the first vehicle; a user interface; a processor; and memory having instructions stored thereon that, when executed by the processor, cause the system to: continuously capture video data of the camera's FOV; detect following vehicles in the captured video data using an object detection model; determine an amount of time that each of the following vehicles follows the first vehicle, wherein the amount of time (T_ID) that each of the following vehicles follows the first vehicle is calculated as: T_ID = (N_L - N_F)/f_r, where f_r denotes a frame rate of the camera, N_F denotes the frame at which the respective following vehicle was first detected in the video data, and N_L denotes the frame at which the respective following vehicle is no longer detected in the video data; determine critical driving behavior of the first vehicle based at least in part on a measure of variation in the detected motion data over the y-axis or z-axis, wherein determining the critical driving behavior comprises filtering noise and removing road condition artifacts from the motion data across at least one coordinate system axis; perform a sensor fusion operation to synchronize the video data and the filtered motion data based on detected time variations to determine whether one or more of the following vehicles is exhibiting abnormal following behavior with respect to the first vehicle based on the amount of time that each of the following vehicles follows the first vehicle and the determined critical driving behavior of the first vehicle; and responsive to determining that one or more of the following vehicles is exhibiting abnormal following behavior, generate and output an audio or visual alert or output driving instructions to a designated safe location via the user interface.
  2. The system of claim 1, wherein the at least one sensor comprises an inertial measurement unit (IMU).
  3. The system of claim 1, wherein the at least one sensor, camera, processor, user interface, and memory are components of a smartphone.
  4. The system of claim 1, wherein one or more of the at least one sensor, camera, processor, user interface, and memory are components of a vehicle computer.
  5. The system of claim 1, wherein the object detection model is a deep convolutional neural network.
  6. The system of claim 1, wherein the object detection model is a You Only Look Once (YOLO) algorithm.
  7. The system of claim 1, wherein the noise is filtered from the motion data using a Savitzky-Golay filter.
  8. The system of claim 1, wherein to determine whether one or more of the following vehicles is exhibiting abnormal following behavior with respect to the first vehicle based on the amount of time that each of the following vehicles follows the first vehicle and the critical driving behavior of the first vehicle, the instructions cause the system to: determine an anomaly score for each of the following vehicles based on the critical driving behavior of the first vehicle within respective amounts of time that each of the following vehicles were following the first vehicle.
  9. The system of claim 8, wherein the anomaly scores are determined using a Local Outlier Factor (LOF) algorithm.
  10. The system of claim 1, wherein the instructions further cause the system to alert an operator of the first vehicle if one or more of the following vehicles is determined to exhibit abnormal following behavior.
  11. A method for detecting abnormal following vehicles, the method comprising: obtaining motion data for a first vehicle via at least one sensor along the x-axis, y-axis, and z-axis; continuously capturing video data of a rear FOV of the first vehicle via a second sensor; detecting one or more following vehicles in the captured video data using an object detection model; determining an amount of time that each of the following vehicles follows the first vehicle, wherein the amount of time (T_ID) that each of the following vehicles follows the first vehicle is calculated as: T_ID = (N_L - N_F)/f_r, where f_r denotes a frame rate of the second sensor, N_F denotes the frame at which the respective following vehicle was first detected in the video data, and N_L denotes the frame at which the respective following vehicle is no longer detected in the video data; determining critical driving behavior of the first vehicle based at least in part on a measure of variation in the motion data over the y-axis or z-axis, wherein determining the critical driving behavior comprises filtering noise and removing road condition artifacts from the motion data across at least one coordinate system axis; performing a sensor fusion operation to synchronize the video data and the filtered motion data based on detected time variations to determine whether one or more of the following vehicles is exhibiting abnormal following behavior with respect to the first vehicle based on the amount of time that each of the following vehicles follows the first vehicle and the determined critical driving behavior of the first vehicle; and responsive to determining that one or more of the following vehicles is exhibiting abnormal following behavior, generating and outputting an audio or visual alert or outputting driving instructions to a designated safe location via a user interface.
  12. The method of claim 11, wherein the object detection model comprises at least one of a deep convolutional neural network or a YOLO algorithm.
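The following-time computation recited in claims 1 and 11 can be sketched in a few lines of Python. The per-frame detection format below — an iterable of (frame index, set of tracked vehicle IDs) pairs — is a hypothetical representation chosen for illustration; the claims only fix the formula T_ID = (N_L - N_F)/f_r itself:

```python
def following_times(detections, frame_rate):
    """Compute T_ID = (N_L - N_F) / f_r for each tracked vehicle ID.

    detections: iterable of (frame_index, ids) pairs, where ids is the set
    of vehicle IDs the object detector reported in that frame.
    frame_rate: camera frame rate f_r in frames per second.
    Returns a dict mapping vehicle ID -> following time in seconds.
    """
    first, last = {}, {}
    for n, ids in detections:
        for vid in ids:
            first.setdefault(vid, n)  # N_F: first frame the vehicle appears
            last[vid] = n             # N_L: last frame seen so far
    return {vid: (last[vid] - first[vid]) / frame_rate for vid in first}
```

For example, at 30 fps a vehicle detected from frame 0 through frame 90 yields a following time of 3.0 seconds.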

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to and the benefit of U.S. Provisional Application No. 63/486,133, titled “SYSTEM AND METHODS FOR DETECTING ABNORMAL FOLLOWING VEHICLES,” filed on Feb. 21, 2023, the content of which is hereby incorporated by reference herein in its entirety.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

This invention was made with government support under Grant Nos. 2007581, 2128567, and 2112471 awarded by the National Science Foundation. The government has certain rights in the invention.

BACKGROUND

Recent studies show that Americans spend about one hour behind the wheel of a vehicle every day. Being followed by another vehicle while driving is not only a frightening experience but also risks exposure of sensitive or private information (e.g., home address, work address, daily routines). Moreover, abnormal following behavior can cause significant traffic issues, such as accidents and delays, because a following vehicle must maintain an appropriate separation from the followed vehicle without losing it or being detected. To improve the safety and privacy of drivers, it would be beneficial to detect abnormal following behaviors of other vehicles and, moreover, to discriminate between abnormal following vehicles (e.g., “stalking” vehicles) and other, non-threatening or “normal” following vehicles.

SUMMARY

Embodiments of the present disclosure provide an infrastructure-free system that can detect abnormal following vehicles during driving. Simply comparing the driving trajectory of following vehicles against that of a target vehicle introduces high false-positive and false-negative rates, because an abnormal following vehicle need not share the target vehicle's trajectory: it can follow the primary vehicle anywhere and in any manner.
Furthermore, normal following vehicles may share the target vehicle's trajectory simply because of a single-track road or a common destination. Embodiments of the present disclosure provide systems and methods that can detect abnormal following vehicles using, for example, sensor fusion operations. In some implementations, an imaging sensor, such as a camera, is used to extract a following vehicle's following time, and additional sensors (e.g., Inertial Measurement Unit (IMU) sensors, gyroscope) are used to obtain data from which the primary vehicle's critical driving behavior (e.g., making a left or right turn) can be determined. The spatial diversity of the IMU sensing data can be used to remove road surface condition artifacts (e.g., bumps on the road surface) from critical driving behavior (CDB) detection. In some implementations, machine learning-based anomaly detection algorithms are leveraged to detect abnormal following vehicles based on a following vehicle's following time and the primary vehicle's critical driving behavior within that time. In some implementations, a system for detecting abnormal following vehicles is provided.
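Claim 7 names a Savitzky-Golay filter for denoising the motion data before critical-driving-behavior detection. A minimal sketch of that filtering step follows, using the classic 5-point quadratic Savitzky-Golay convolution weights (-3, 12, 17, 12, -3)/35; the window length and polynomial order here are illustrative assumptions, as the patent does not fix them:

```python
# 5-point, quadratic (order-2) Savitzky-Golay smoothing.
# The weights satisfy sum(c) = 35, sum(c*k) = 0, sum(c*k^2) = 0, so the
# filter reproduces any quadratic exactly while averaging out jitter.
SG5_QUADRATIC = (-3, 12, 17, 12, -3)

def savgol5(signal):
    """Smooth a 1-D motion-data sequence; endpoints are left unchanged."""
    out = list(signal)
    for i in range(2, len(signal) - 2):
        out[i] = sum(c * signal[i + k]
                     for c, k in zip(SG5_QUADRATIC, range(-2, 3))) / 35.0
    return out
```

Because an order-2 Savitzky-Golay fit preserves polynomials up to degree 2 exactly, slowly varying turn signatures in the IMU trace pass through intact while high-frequency sensor noise is suppressed — which is the usual reason to prefer it over a plain moving average for this kind of feature extraction.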
The system can include: at least one sensor configured to detect motion data of a first vehicle; a camera positioned within the first vehicle such that a field-of-view (FOV) of the camera faces outward from a rear of the first vehicle; a processor; and memory having instructions stored thereon that, when executed by the processor, cause the system to: continuously capture video data of the camera's FOV; detect following vehicles in the captured video data using an object detection model; determine an amount of time that each of the following vehicles follows the first vehicle; determine critical driving behavior of the first vehicle based on the detected motion data; and determine whether one or more of the following vehicles is exhibiting abnormal following behavior with respect to the first vehicle based on the amount of time that each of the following vehicles follows the first vehicle and the critical driving behavior of the first vehicle. In some implementations, the at least one sensor includes an inertial measurement unit (IMU). In some implementations, the at least one sensor, camera, processor, and memory are components of a smartphone. In some implementations, one or more of the at least one sensor, camera, processor, and memory are components of a vehicle computer. In some implementations, the object detection model is a deep convolutional neural network. In some implementations, the object detection model is a You Only Look Once (YOLO) algorithm. In some implementations, the amount of time (T_ID) that each of the following vehicles follows the first vehicle is calculated as: T_ID = (N_L - N_F)/f_r, where f_r denotes a frame rate of the camera, N_F denotes the frame at which the respective following vehicle was first detected in the video data, and N_L denotes the frame at which the respective following vehicle is no longer detected in the video data.
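Claims 8 and 9 score each following vehicle with a Local Outlier Factor (LOF) computed over its following behavior. The sketch below implements standard LOF from scratch over 2-D feature vectors; the feature choice (following time, number of critical driving behaviors observed within that time) and the neighborhood size k are assumptions for illustration — production code would more likely use an off-the-shelf implementation such as scikit-learn's LocalOutlierFactor:

```python
import math

def _knn(points, i, k):
    # k nearest neighbors of point i as (distance, index) pairs.
    dists = sorted((math.dist(points[i], points[j]), j)
                   for j in range(len(points)) if j != i)
    return dists[:k]

def lof_scores(points, k=2):
    """Local Outlier Factor for each point; scores >> 1 indicate outliers."""
    n = len(points)
    neigh = [_knn(points, i, k) for i in range(n)]
    k_dist = [neigh[i][-1][0] for i in range(n)]  # distance to k-th neighbor

    def reach(i, j):
        # Reachability distance of i with respect to neighbor j.
        return max(k_dist[j], math.dist(points[i], points[j]))

    # Local reachability density: inverse of mean reachability distance.
    lrd = []
    for i in range(n):
        total = sum(reach(i, j) for _, j in neigh[i])
        lrd.append(len(neigh[i]) / total if total > 0 else float("inf"))

    # LOF: neighbors' density relative to the point's own density.
    return [sum(lrd[j] for _, j in neigh[i]) / (len(neigh[i]) * lrd[i])
            for i in range(n)]
```

A vehicle whose (following time, CDB count) pair sits far from the cluster of ordinary traffic receives a score well above 1 and would trigger the alert path recited in claim 1.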