EP-4738286-A1 - A COMPUTER-IMPLEMENTED METHOD FOR DETECTING DRIVER DISTRACTION IN A CURRENT TRAFFIC SCENE
Abstract
A computer-implemented method (20) for detecting driver distraction in a current traffic scene, comprising: - capturing (21) the driver's gaze positions over time; - constructing (22) a spatio-temporal graph representation of the current traffic scene, where detected objects in the current traffic scene and the driver's historical gaze position data are represented as nodes; - analyzing (23) the driver's gaze position in relation to the detected objects in the current traffic scene; - estimating (24) a probability distribution of the driver's next gaze position based on the current traffic scene; and - comparing (25) the current measured gaze position to the estimated gaze probability distribution to detect deviations indicative of driver distraction.
Inventors
- ABDELKAWY, Hazem
- PALMER, Luke
- PALASEK, Petar
Assignees
- TOYOTA JIDOSHA KABUSHIKI KAISHA
- Glimpse Technology Limited
Dates
- Publication Date
- 20260506
- Application Date
- 20241030
Claims (12)
- A computer-implemented method (20) for detecting driver distraction in a current traffic scene, comprising: - capturing (21) the driver's gaze positions over time; - constructing (22) a spatio-temporal graph representation of the current traffic scene, where detected objects in the current traffic scene and the driver's historical gaze position data are represented as nodes; - analyzing (23) the driver's gaze position in relation to the detected objects in the current traffic scene; - estimating (24) a probability distribution of the driver's next gaze position based on the current traffic scene; and - comparing (25) the current measured gaze position to the estimated gaze probability distribution to detect deviations indicative of driver distraction.
- The computer-implemented method (20) according to claim 1, further comprising triggering (26) an alert if the deviation exceeds a predefined threshold, indicating driver distraction.
- The computer-implemented method (20) according to claim 1 or 2, wherein analyzing (23) the driver's gaze position comprises executing (230) a graph transformer model.
- The computer-implemented method (20) according to claim 3, wherein the graph transformer model is configured to process both spatial and temporal relationships between objects in the traffic scene.
- The computer-implemented method (20) according to any one of claims 1 to 4, wherein the detected objects are classified into categories including vehicles, pedestrians, and traffic control devices, each being represented as distinct node types in the spatio-temporal graph.
- The computer-implemented method (20) according to any one of claims 1 to 5, wherein the historical gaze positions data are incorporated in the spatio-temporal graph by connecting gaze nodes across multiple timeframes.
- The computer-implemented method (20) according to any one of claims 1 to 6, wherein estimating (24) the probability of the driver's next gaze position comprises predicting (240) gaze shifts toward the detected objects using a Gaussian mixture model.
- A computer program set including instructions for executing the steps of the method (20) of any one of claims 1 to 7 when said program set is executed by at least one computer.
- A recording medium readable by at least one computer and having recorded thereon at least one computer program including instructions for executing the steps of the method (20) of any one of claims 1 to 7.
- An apparatus (10) for detecting driver distraction in a current traffic scene, comprising: - a capturing module (11) configured to capture the driver's gaze positions over time; - a constructing module (12) configured to construct a spatio-temporal graph representation of the current traffic scene, where detected objects in the current traffic scene and the driver's historical gaze position data are represented as nodes; - an analyzing module (13) configured to analyze the driver's gaze position in relation to the detected objects in the current traffic scene; - an estimating module (14) configured to estimate a probability distribution of the driver's next gaze position based on the current traffic scene; and - a comparing module (15) configured to compare the current measured gaze position to the estimated gaze probability distribution to detect deviations indicative of driver distraction.
- The apparatus (10) according to claim 10, wherein the capturing module (11) is an in-vehicle camera or an eye-tracking system.
- A vehicle (1) comprising an apparatus (10) according to claim 10 or 11.
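Claim 7's Gaussian-mixture estimate and claim 1's comparing step can be sketched as follows. This is a minimal illustration, not the patented implementation: the mixture parameters, the threshold value, and the function names are all hypothetical, and in the claimed method the mixture would be produced by the graph-based model rather than hard-coded.

```python
import numpy as np

def gmm_pdf(gaze, means, covs, weights):
    """Evaluate a 2-D Gaussian mixture density at a gaze position.

    Component k has mean means[k] (shape (2,)), covariance covs[k]
    (shape (2, 2)) and mixing weight weights[k]; weights sum to 1.
    """
    density = 0.0
    for mu, cov, w in zip(means, covs, weights):
        diff = gaze - mu
        inv = np.linalg.inv(cov)
        norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
        density += w * norm * np.exp(-0.5 * diff @ inv @ diff)
    return density

def is_distracted(gaze, means, covs, weights, threshold):
    """Comparing step: flag distraction when the measured gaze falls in
    a low-density region of the estimated next-gaze distribution."""
    return gmm_pdf(np.asarray(gaze, dtype=float), means, covs, weights) < threshold

# Hypothetical scene: two attended objects (e.g. a lead vehicle and a
# pedestrian) induce a two-component mixture over gaze coordinates.
means = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
covs = [np.eye(2), np.eye(2)]
weights = [0.7, 0.3]

print(is_distracted([0.1, 0.1], means, covs, weights, 1e-3))    # → False (near a mode)
print(is_distracted([20.0, -20.0], means, covs, weights, 1e-3)) # → True (far from both modes)
```

A production system would fit the mixture online from the model's predictions and calibrate the threshold against labelled distraction episodes, which is what claim 2's "predefined threshold" leaves open.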
Description
BACKGROUND OF THE INVENTION

1. Field of the invention

The present disclosure relates to the field of automotive safety systems and, more particularly, to an apparatus and a method for detecting driver distraction in a current traffic scene.

2. Description of Related Art

Increased traffic density on the roads requires that vehicle drivers maintain continuous and undivided attention to their surroundings. However, the human capacity for sustained concentration is inherently limited, often leading to lapses in attention and critical driving errors. Statistically, human error is implicated in approximately 94% of road accidents, largely due to factors such as driver inattention, distraction, and insufficient situational awareness. To address this, Advanced Driver Assistance Systems (ADAS) have been developed to reduce the cognitive load on drivers, employing sensors such as cameras, LiDAR, and radar to monitor the vehicle's external environment. These sensors provide real-time data on the surroundings, enabling the system to identify potential hazards such as vehicles, pedestrians, and obstacles. Using machine learning models, ADAS classifies these detected objects into predefined categories and determines the appropriate response, whether that involves autonomous actions, such as automatic braking, or issuing warnings through visual, audio, or haptic feedback to alert the driver. However, one of the primary challenges with current ADAS technology is its limited ability to accurately recognize driver distraction, which can arise from a variety of sources, both internal (e.g., fatigue or cognitive strain) and external (e.g., interactions with passengers or mobile devices). While ADAS excels at detecting and responding to external hazards, it often overlooks a critical internal factor: the driver's level of attention.
Thus, even when ADAS accurately identifies obstacles or road conditions, a distracted driver's response may be delayed or ineffective, increasing the risk of accidents. Accordingly, recognizing driver distraction is essential, as it directly impacts road safety. To address this gap, it is important to integrate monitoring systems that use in-vehicle cameras and sensors to track the driver's gaze and identify moments of distraction. The need for such integration is particularly urgent as vehicles progress toward higher levels of automation (levels 2-4), where the driver may not always be fully engaged with the driving task. Without robust monitoring of the driver's attention, even the most advanced ADAS could fail in situations where quick human intervention is needed. Therefore, there is a need to incorporate distraction recognition capabilities to ensure timely interventions and maintain safety in increasingly complex driving environments.

SUMMARY

The object of the present invention is to at least substantially address the aforementioned drawbacks. In this respect, the aim of the invention is to provide a computer-implemented method for detecting driver distraction in a current traffic scene, comprising: capturing the driver's gaze positions over time; constructing a spatio-temporal graph representation of the current traffic scene, where detected objects in the current traffic scene and the driver's historical gaze position data are represented as nodes; analyzing the driver's gaze position in relation to the detected objects in the current traffic scene; estimating a probability distribution of the driver's next gaze position based on the current traffic scene; and comparing the current measured gaze position to the estimated gaze probability distribution to detect deviations indicative of driver distraction. The invention aims to improve the detection of driver distraction by addressing certain limitations in current systems.
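The five steps of the claimed method can be sketched as a simple pipeline. This is a toy stand-in for illustrating the data flow only: the patented method estimates the next gaze position with a graph transformer and a Gaussian mixture model, whereas the placeholder below blends the last gaze point with the nearest scene object; all function names, the threshold, and the coordinates are assumptions.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def estimate_next_gaze(scene_objects: List[Point], gaze_history: List[Point]) -> Point:
    """Toy stand-in for the estimating step: predict the next gaze
    position as the midpoint between the nearest detected object and
    the last measured gaze point."""
    last = gaze_history[-1]
    nearest = min(scene_objects, key=lambda o: math.dist(o, last))
    return ((nearest[0] + last[0]) / 2.0, (nearest[1] + last[1]) / 2.0)

def detect_distraction(scene_objects: List[Point], gaze_history: List[Point],
                       measured: Point, threshold: float) -> bool:
    """Comparing step: report a deviation between the measured gaze and
    the estimate larger than the threshold as distraction."""
    predicted = estimate_next_gaze(scene_objects, gaze_history)
    return math.dist(predicted, measured) > threshold

objects = [(1.0, 1.0), (8.0, 2.0)]    # detected vehicle / pedestrian positions
history = [(0.8, 0.9), (1.1, 1.0)]    # the driver has been watching object 1
print(detect_distraction(objects, history, (1.0, 1.0), 2.0))  # → False (attentive)
print(detect_distraction(objects, history, (9.0, 9.0), 2.0))  # → True (distracted)
```

In the actual method the point estimate is replaced by a full probability distribution, so the comparison becomes a likelihood test rather than a distance threshold.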
The method begins by tracking the driver's gaze positions continuously as they drive. This is done using, for example, sensors such as in-vehicle cameras or eye-tracking devices that can monitor where the driver is looking at any given moment. These gaze positions are recorded as time-series data, capturing the movement of the driver's eyes and attention over time. Next, the method builds a graph that represents the current traffic scene, where both the detected objects (such as vehicles, pedestrians, or traffic signs) and the driver's historical gaze data are modeled as nodes. The "spatio-temporal" aspect means that the graph captures both the spatial arrangement of objects in the scene and the timing of the driver's gaze behavior. This allows the relationships between the driver's focus points and the relevant traffic objects to be analyzed over time. Once the graph is constructed, the method evaluates how the driver's gaze positions align with the objects in the traffic scene. To predict future driver behavior, the method uses the traffic scene and the driver's historical gaze data to estimate a probability distribution of where the driver is likely to look next.
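The spatio-temporal graph described above can be sketched with a minimal data structure: object and gaze nodes per timeframe, spatial edges from each gaze node to the objects visible in that frame, and temporal edges linking gaze nodes across frames (as in claim 6). The concrete node and edge layout below is an assumption for illustration; the patent does not fix a particular representation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Node:
    node_type: str               # "vehicle", "pedestrian", "traffic_control", "gaze"
    position: Tuple[float, float]
    timeframe: int

@dataclass
class SpatioTemporalGraph:
    nodes: List[Node] = field(default_factory=list)
    edges: List[Tuple[int, int]] = field(default_factory=list)  # pairs of node indices

    def add_frame(self, objects: Dict[str, Tuple[float, float]],
                  gaze: Tuple[float, float], t: int) -> None:
        """Add one timeframe: object nodes, a gaze node, spatial edges
        from the gaze node to every object in the frame, and a temporal
        edge connecting this gaze node to the previous one."""
        prev_gaze = next((i for i in range(len(self.nodes) - 1, -1, -1)
                          if self.nodes[i].node_type == "gaze"), None)
        obj_ids = []
        for kind, pos in objects.items():
            self.nodes.append(Node(kind, pos, t))
            obj_ids.append(len(self.nodes) - 1)
        self.nodes.append(Node("gaze", gaze, t))
        gaze_id = len(self.nodes) - 1
        self.edges += [(gaze_id, i) for i in obj_ids]   # spatial edges
        if prev_gaze is not None:
            self.edges.append((gaze_id, prev_gaze))     # temporal edge (claim 6)

g = SpatioTemporalGraph()
g.add_frame({"vehicle": (3.0, 1.0)}, gaze=(2.9, 1.1), t=0)
g.add_frame({"vehicle": (3.2, 1.0), "pedestrian": (6.0, 0.5)}, gaze=(3.1, 1.0), t=1)
print(len(g.nodes), len(g.edges))  # → 5 4 (3 object nodes, 2 gaze nodes, 3 spatial + 1 temporal edge)
```

A graph transformer, as in claims 3 and 4, would then attend over these nodes and edges to process the spatial and temporal relationships jointly.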