EP-3553761-B1 - METHOD AND DEVICE FOR PERFORMING DIFFERENTIAL ANALYSIS OF VEHICLES
Inventors
- BOULTON, ADAM JOHN
Dates
- Publication Date
- 2026-05-06
- Application Date
- 2019-04-12
Claims (14)
- A method (300) for vehicle differential analysis at a computing device (106) in a vehicle (104), the method (300) comprising: obtaining (302) images associated with a proximate vehicle (102, 502); maintaining a window of the obtained images; obtaining (304) sensor data associated with the proximate vehicle (102, 502); determining (306) from the sensor data that the proximate vehicle (102, 502) is turning or changing lanes; determining (308) an expected visual cue associated with the proximate vehicle (102, 502) turning or changing lanes; detecting (310) a difference between the expected visual cue and the maintained window of images obtained prior to determining that the proximate vehicle (102, 502) is turning or changing lanes; and transmitting (312) an indication of the detected difference through a communications interface to a network (410).
- The method (300) of claim 1, further comprising obtaining identification information associated with the proximate vehicle (102, 502).
- The method (300) of claim 2, wherein the identification information is obtained from the images from a license plate of the proximate vehicle (102, 502).
- The method (300) of claim 3, further comprising using optical character recognition on the license plate to determine vehicle identification information.
- The method (300) of claim 4, wherein transmitting the indication of the detected difference through the communications interface to a network (410) includes the identification information.
- The method (300) of any previous claim, further comprising performing an action by the vehicle (104) in response to the detected difference.
- The method (300) of claim 6, wherein the vehicle (104) is an autonomous vehicle, and wherein the action comprises changing lanes.
- The method (300) of any previous claim, wherein the sensor data is provided from one or more of a radar unit, an ultrasonic unit, a LIDAR unit and a camera.
- The method (300) of any previous claim wherein the expected visual cue is a turn signal of the proximate vehicle (102, 502).
- The method (300) of any previous claim, wherein the images are received from a forward facing camera of the vehicle (104).
- A computing device (106) in a vehicle (104) for providing differential analysis, the computing device (106) comprising: a processor (202); and a communications subsystem (214), wherein the computing device (106) is configured to: obtain images associated with a proximate vehicle (102, 502); maintain a window of the obtained images; obtain sensor data associated with the proximate vehicle (102, 502); determine from the sensor data that the proximate vehicle (102, 502) is turning or changing lanes; determine an expected visual cue associated with the proximate vehicle (102, 502) turning or changing lanes; detect a difference between the expected visual cue and the maintained window of images obtained prior to determining that the proximate vehicle (102, 502) is turning or changing lanes; and transmit an indication of the detected difference through a communications subsystem to a network (410).
- The computing device (106) of claim 11, wherein the computing device (106) is further configured to obtain identification information associated with the proximate vehicle (102, 502).
- The computing device (106) of claim 11, wherein the computing device (106) is further configured to use optical character recognition on a license plate to determine vehicle identification information.
- The computing device (106) of claim 11, wherein the computing device (106) is further configured to perform an action by the vehicle (104) in response to the detected difference.
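The claimed method can be illustrated with a minimal sketch: a rolling window of frames of the proximate vehicle is maintained, and when sensor data indicates a turn or lane change, the window captured before that determination is checked for the expected visual cue (a turn signal). The `turn_signal_on` flag, the window size, and the shape of the returned indication are assumptions for illustration, not details from the claims.

```python
from collections import deque

class DifferentialAnalyzer:
    """Sketch of the claimed differential analysis: maintain a window of
    obtained images of a proximate vehicle and, when the sensor data shows
    it is turning or changing lanes, check the window for the expected
    visual cue (here, an activated turn signal)."""

    def __init__(self, window_size=30):
        # window_size is an assumed value (~1 s at 30 fps); the claims do
        # not specify the length of the maintained window of images
        self.window = deque(maxlen=window_size)

    def obtain_image(self, frame):
        # frame stands in for a per-image feature record; the hypothetical
        # 'turn_signal_on' flag would come from an upstream image classifier
        self.window.append(frame)

    def on_maneuver_detected(self):
        """Called once radar/ultrasonic/LIDAR/camera data indicates the
        proximate vehicle is turning or changing lanes. Returns an
        indication of the detected difference (for transmission through
        the communications subsystem to the network), or None."""
        if any(frame.get("turn_signal_on") for frame in self.window):
            return None  # expected cue was present before the maneuver
        return {"difference": "missing_turn_signal",
                "frames_checked": len(self.window)}

# Usage: frames with no turn signal, followed by a detected lane change
analyzer = DifferentialAnalyzer(window_size=5)
for _ in range(5):
    analyzer.obtain_image({"turn_signal_on": False})
indication = analyzer.on_maneuver_detected()
# indication reports the missing cue; a real system would transmit it,
# optionally with OCR'd license-plate identification, to the network
```

In this sketch the indication would be enriched with identification information (e.g. from optical character recognition of the license plate, per the dependent claims) before transmission.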
Description
TECHNICAL FIELD
The invention relates to detecting possible issues of vehicles on the road using visual characteristics.

BACKGROUND
Vehicles may have different levels of automation associated with their driving mode and type of driving scenarios. The level of automation of a vehicle may range from level 0, indicating that there is no autonomy and the driver is responsible for performing all driving tasks, to level 5, indicating that the vehicle is fully autonomous and is capable of performing all driving tasks under all conditions without driver input. Autonomous or semi-autonomous vehicles may improve the safety of roadways by improving drivers' reactions, for example with emergency braking, or, for fully autonomous vehicles, by removing the human driver from driving decisions. While autonomous or semi-autonomous vehicles may help improve the overall safety of the roadway through improved driving decisions and/or reactions, the overall safety also depends upon all of the vehicles sharing the roadway, regardless of their levels of automation, being in good working order. It is therefore desirable to monitor vehicles on a roadway in order to improve the overall safety of the vehicles travelling on the roadways.

US2017316694 (A1) describes a vehicle which detects a failure of the rear brake lamps of a preceding vehicle. The vehicle includes a distance detection unit that detects a distance from the vehicle to the preceding vehicle, an image acquisition unit that acquires an image of the preceding vehicle, and a controller that detects that a failure of a rear brake lamp of the preceding vehicle has occurred using a speed of the preceding vehicle, obtained from variations in the detected distance, and the acquired image of the preceding vehicle. Additionally, the controller generates acceleration and deceleration state information of the preceding vehicle in response to detecting the failure of the rear brake lamp of the preceding vehicle.

In US2017316693 (A1), a traffic information big data operation server using license plate recognition of means of transportation includes: a transportation information integrating unit which receives an image of a license plate of the means of transportation, and transportation information containing the acquisition time and acquisition point of the image, from a plurality of user terminals, and integrates an identification number, time and location of the means of transportation from the transportation information to organize traffic information big data; a target information acquiring unit which receives an identification number of a target means of transportation from a control terminal; a target recent location acquiring unit which acquires a target time and target location corresponding to the identification number of the target means of transportation; an intersection probability calculating unit; and a target current location predicting unit.

EP1341140 (A1) relates to a vehicle control apparatus in which the use of a camera enhances safety. A distance detecting sensor detects the distance between a vehicle positioned ahead of a self-vehicle and the self-vehicle, and the camera detects the vehicle positioned ahead. If the distance detecting sensor detects the ahead-positioned vehicle, a control unit uses the camera to judge the state of the rear lamps of the ahead-positioned vehicle. If the control unit detects a malfunction of the rear lamps of the ahead-positioned vehicle, the control unit communicates the malfunction, via a communication unit, to the driver of the ahead-positioned vehicle and/or to a public agency.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure will be better understood with reference to the drawings, in which:
- FIG. 1 depicts a roadway environment in which vehicle differential analysis may be performed;
- FIG. 2 depicts vehicle components for performing vehicle differential analysis;
- FIG. 3 depicts a method of vehicle differential analysis;
- FIG. 4 depicts components of a system for improving vehicle safety using vehicle differential analysis; and
- FIG. 5 depicts an additional roadway environment for improving vehicle safety using vehicle differential analysis.

DETAILED DESCRIPTION
In accordance with an aspect of the present invention there is provided a method for vehicle differential analysis at a computing device in a vehicle, the method comprising: obtaining images associated with a proximate vehicle; maintaining a window of the obtained images; obtaining sensor data associated with the proximate vehicle; determining from the sensor data that the proximate vehicle is turning or changing lanes; determining an expected visual cue associated with the proximate vehicle turning or changing lanes; detecting a difference between the expected visual cue and the maintained window of images obtained prior to determining that the proximate vehicle is turning or changing lanes; and