US-12625544-B2 - Eye tracking system for determining user activity
Abstract
Embodiments relate to an eye tracking system. A headset of the system includes an eye tracking sensor that captures eye tracking data indicating positions and movements of a user's eye. A controller (e.g., in the headset) of the eye tracking system analyzes the eye tracking data from the sensor to determine eye tracking feature values of the eye during a time period. The controller determines an activity of the user during the time period based on the eye tracking feature values and updates an activity history of the user with the determined activity.
Inventors
- Kevin Conlon Boyle
- Robert Konrad Konrad
- Nitish Padmanaban
Assignees
- SESAME AI, INC.
Dates
- Publication Date: 2026-05-12
- Application Date: 2022-11-03
Claims (16)
- 1 . A method comprising: analyzing eye tracking data to determine first eye tracking feature values for a first eye tracking feature of an eye of a user of a headset during a time period, the first eye tracking feature of the eye being a first characteristic of the eye, wherein the eye tracking data is determined from an eye tracking system on the headset; analyzing the eye tracking data to determine second eye tracking feature values for a second eye tracking feature of the eye of the user of the headset during the time period, the second eye tracking feature of the eye being a second characteristic of the eye and being different than the first eye tracking feature; determining an activity of the user during the time period based on the determined first eye tracking feature values and the determined second eye tracking feature values, wherein the activity of the user is determined without referencing an outward facing camera image, and wherein the determined activity of the user is not the first eye tracking feature values or the second eye tracking feature values; updating an activity history of the user with the determined activity; monitoring changes in the first eye tracking feature values and monitoring changes in the second eye tracking feature values; and determining that the user transitions from the activity to a second activity based on the monitored changes of the first eye tracking feature values and the second eye tracking feature values.
- 2 . The method of claim 1 , wherein determining the activity comprises identifying first eye tracking feature values and second eye tracking feature values that correspond to the activity.
- 3 . The method of claim 2 , wherein the first eye tracking feature values include movements of the eye, and determining the activity comprises identifying movements of the eye that correspond to the activity.
- 4 . The method of claim 1 , wherein determining the activity of the user comprises determining eye tracking feature vectors representing first eye tracking feature values and second eye tracking feature values for points in time during the time period, wherein at least one of the determined eye tracking feature vectors represents one of the first eye tracking feature values and one of the second eye tracking feature values for a point in time during the time period.
- 5 . The method of claim 4 , wherein the activity is determined by analyzing a distribution of the eye tracking feature vectors over the time period.
- 6 . The method of claim 4 , wherein determining the activity comprises: applying a vector clustering model to the eye tracking feature vectors to form activity clusters; and determining the activity based on activity clusters at points in time during the time period.
- 7 . The method of claim 1 , wherein multiple activities performed by the user throughout a day are determined.
- 8 . A non-transitory computer-readable storage medium comprising stored instructions, the instructions, when executed by a computer device, causing the computer device to: analyze eye tracking data to determine first eye tracking feature values for a first eye tracking feature of an eye of a user of a headset during a time period, the first eye tracking feature of the eye being a first characteristic of the eye, wherein the eye tracking data is determined from an eye tracking system on the headset; analyze the eye tracking data to determine second eye tracking feature values for a second eye tracking feature of the eye of the user of the headset during the time period, the second eye tracking feature of the eye being a second characteristic of the eye and being different than the first eye tracking feature; determine an activity of the user during the time period based on the determined first eye tracking feature values and the determined second eye tracking feature values, wherein the activity of the user is determined without referencing an outward facing camera image, and wherein the determined activity of the user is not the first eye tracking feature values or the second eye tracking feature values; update an activity history of the user with the determined activity; monitor changes in the first eye tracking feature values and monitor changes in the second eye tracking feature values; and determine that the user transitions from the activity to a second activity based on the monitored changes of the first eye tracking feature values and the second eye tracking feature values.
- 9 . The non-transitory computer-readable storage medium of claim 8 , wherein to determine the activity, the non-transitory computer-readable storage medium further comprises instructions that cause the computer device to identify first feature values and second feature values that correspond to the activity.
- 10 . The non-transitory computer-readable storage medium of claim 9 , wherein: the first feature values include movements of the eye; and to determine the activity, the non-transitory computer-readable storage medium further comprises instructions that cause the computer device to identify movements of the eye that correspond to the activity.
- 11 . The non-transitory computer-readable storage medium of claim 8 , wherein to determine the activity, the non-transitory computer-readable storage medium further comprises instructions that cause the computer device to determine eye tracking feature vectors representing first eye tracking feature values and second eye tracking feature values for points in time during the time period, wherein at least one of the determined eye tracking feature vectors represents one of the first eye tracking feature values and one of the second eye tracking feature values for a point in time during the time period.
- 12 . The non-transitory computer-readable storage medium of claim 11 , wherein to determine the activity, the non-transitory computer-readable storage medium further comprises instructions that cause the computer device to analyze a distribution of the eye tracking feature vectors over the time period.
- 13 . The non-transitory computer-readable storage medium of claim 11 , wherein to determine the activity, the non-transitory computer-readable storage medium further comprises instructions that cause the computer device to: apply a vector clustering model to the eye tracking feature vectors to form activity clusters; and determine the activity based on activity clusters at points in time during the time period.
- 14 . The non-transitory computer-readable storage medium of claim 8 , wherein the non-transitory computer-readable storage medium comprises instructions that cause the computer device to determine and track multiple activities of the user throughout a day.
- 15 . A headset comprising: one or more sensors embedded into a frame of the headset and configured to capture eye tracking data indicating positions and movements of an eye of a user of the headset; and a controller configured to: analyze eye tracking data from the one or more sensors to determine first eye tracking feature values for a first eye tracking feature of the eye during a time period, the first eye tracking feature of the eye being a first characteristic of the eye; analyze the eye tracking data to determine second eye tracking feature values for a second eye tracking feature of the eye of the user of the headset during the time period, the second eye tracking feature of the eye being a second characteristic of the eye and being different than the first eye tracking feature; determine an activity of the user during the time period based on the determined first eye tracking feature values and the determined second eye tracking feature values, wherein the activity of the user is determined without referencing an outward facing camera image, and wherein the determined activity of the user is not the first eye tracking feature values or the second eye tracking feature values; update an activity history of the user with the determined activity; monitor changes in the first eye tracking feature values and monitor changes in the second eye tracking feature values; and determine that the user transitions from the activity to a second activity based on the monitored changes of the first eye tracking feature values and the second eye tracking feature values.
- 16 . The headset of claim 15 , wherein to determine the activity, the controller is further configured to identify first feature values and second eye tracking feature values that correspond to the activity.
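The method of claims 1 through 3 determines values for two different eye tracking features of an eye over a time period, determines the user's activity from those values, and detects a transition to a second activity as the values change. The Python sketch below is a minimal, illustrative reading of that flow and is not the claimed implementation: the specific features (saccade rate and horizontal gaze range), the thresholds, and the activity labels are assumptions added here for readability.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    t: float      # timestamp in seconds
    yaw: float    # horizontal gaze angle in degrees
    pitch: float  # vertical gaze angle in degrees

def angular_speed(a: GazeSample, b: GazeSample) -> float:
    """Gaze speed between two consecutive samples, in degrees per second."""
    dt = b.t - a.t
    return (((b.yaw - a.yaw) ** 2 + (b.pitch - a.pitch) ** 2) ** 0.5 / dt) if dt > 0 else 0.0

def feature_values(window: List[GazeSample]) -> Tuple[float, float]:
    """Two example eye tracking feature values for one non-empty time window:
    (1) saccade rate, counted as sample pairs whose speed exceeds 30 deg/s, per second;
    (2) horizontal gaze range in degrees. Both feature choices are assumptions."""
    speeds = [angular_speed(a, b) for a, b in zip(window, window[1:])]
    duration = window[-1].t - window[0].t
    saccade_rate = sum(s > 30.0 for s in speeds) / duration if duration > 0 else 0.0
    yaw_range = max(s.yaw for s in window) - min(s.yaw for s in window)
    return saccade_rate, yaw_range

def classify(saccade_rate: float, yaw_range: float) -> str:
    """Toy rule-based mapping from the two feature values to an activity label."""
    if saccade_rate > 2.0 and yaw_range < 30.0:
        return "reading"
    if yaw_range > 60.0:
        return "conversation"
    return "screen work"

def update_activity_history(windows: List[List[GazeSample]]) -> List[str]:
    """Label each time window and report transitions when the label changes."""
    history: List[str] = []
    for window in windows:
        label = classify(*feature_values(window))
        if history and label != history[-1]:
            print(f"transition: {history[-1]} -> {label}")
        history.append(label)
    return history
```

A rule-based mapping is used only to keep the example short; the clustering-based determination of claims 4 through 6 is sketched in the detailed description below.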
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of and priority to U.S. Provisional Patent Application Ser. No. 63/276,106, filed on Nov. 5, 2021, which is hereby incorporated by reference in its entirety.

FIELD OF THE INVENTION

This disclosure relates generally to an eye tracking system, and more specifically to an eye tracking system that determines a user's activity based on eye tracking data.

BACKGROUND

Current eye tracking systems include an outward facing camera and rely on projecting a user's gaze into an outward facing image of the world around the user. In these cases, the eye tracking system estimates the gaze vector of the user and projects it into an outward facing camera image to display where the user is looking in the world. From this, behavioral data about the user's activities and attention may be inferred.

SUMMARY

The human visual system can be a dominant part of human interaction with the world. As such, the behavior of the eyes can be linked to the activity the person is performing. Specifically, eye movements can reveal behavior associated with (e.g., visual) activities (e.g., looking at a computer monitor vs. reading a physical book). Due to this, an eye tracking system described herein can determine a user's activity by analyzing the behavior of one or both of the user's eyes (e.g., without referencing an outward facing camera image).

Embodiments of the present disclosure relate to an eye tracking system. An eye tracking system includes an eye tracking sensor and a controller (e.g., both part of a headset configured to be worn by a user). The eye tracking sensor is configured to capture eye tracking data that indicates eye tracking features of a user's eye (e.g., positions and movements of the user's eye). The controller of the eye tracking system analyzes eye tracking data from the sensor to determine eye tracking feature values of the eye during a time period. The controller determines an (e.g., visual) activity of the user during the time period based on the eye tracking feature values. The controller updates an activity history of the user with the determined activity.

Other aspects include components, devices, systems, improvements, methods, processes, applications, computer readable mediums, and other technologies related to any of the above.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a platform that includes a headset, in accordance with one or more embodiments.
FIG. 2 is a perspective view of a headset that can be integrated into the platform of FIG. 1, in accordance with one or more embodiments.
FIG. 3 is a controller that may be integrated into the headset of FIG. 2, in accordance with one or more embodiments.
FIG. 4 is a plot of eye tracking data of a user over time, in accordance with one or more embodiments.
FIG. 5 is a k-means cluster plot of eye tracking feature vectors, in accordance with one or more embodiments.
FIG. 6 is a plot of eye tracking feature vectors over time, in accordance with one or more embodiments.
FIG. 7A illustrates a chart of a user's activities throughout a day and a chart of the user's focus score throughout the day, in accordance with one or more embodiments.
FIG. 7B illustrates a smartphone displaying activity insights corresponding to the data displayed in FIG. 7A, in accordance with one or more embodiments.
FIG. 8A includes the same charts as FIG. 7A, displaying different data, in accordance with one or more embodiments.
FIG. 8B illustrates a smartphone displaying activity insights corresponding to the data displayed in FIG. 8A, in accordance with one or more embodiments.
FIG. 9 is a flow chart illustrating a process for determining an activity of a user, in accordance with one or more embodiments.

The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.

DETAILED DESCRIPTION

Embodiments of the present disclosure relate to eye tracking systems and methods for using eye tracking data, such as eye movement and gaze, to detect and classify (e.g., visual) activities a user performs over time (e.g., throughout a day). An eye tracking system may be configured to record eye tracking data all day across many contexts and activities. An eye tracking system includes an eye tracking sensor and a controller that processes data from the eye tracking sensor. An eye tracking system may be (e.g., in part) implemented in a headset and may be part of a networked environment. Example headsets in networked environments are further described with respect to FIGS. 1 and 2. However, eye tracking systems are not required to be part of a headset or a networked environment. For example, an eye tracking system may be desk mounted or room mounted.
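As one concrete illustration of the processing recited in claims 4 through 6 (determining eye tracking feature vectors for points in time and applying a vector clustering model, such as the k-means clustering depicted in FIG. 5, to form activity clusters), the following Python sketch shows a minimal version of that idea. The feature vector contents, the number of clusters, and the cluster-to-activity mapping are assumptions introduced here for illustration and are not taken from the disclosure.

```python
# Minimal sketch: cluster per-point eye tracking feature vectors into
# activity clusters with k-means, read off an activity per point in time,
# and update an activity history. Values and labels are illustrative only.
import numpy as np
from sklearn.cluster import KMeans

# Each row is an eye tracking feature vector for one point in time, e.g.
# [saccade rate (1/s), mean fixation duration (s), gaze range (deg), blink rate (1/s)].
feature_vectors = np.array([
    [2.5, 0.20, 15.0, 0.2],
    [2.3, 0.22, 14.0, 0.3],
    [0.8, 0.60, 25.0, 0.4],
    [0.7, 0.65, 22.0, 0.5],
    [1.5, 0.35, 70.0, 0.6],
    [1.6, 0.30, 75.0, 0.5],
])

# Apply a vector clustering model (here k-means) to form activity clusters.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
cluster_per_point = kmeans.fit_predict(feature_vectors)

# Hypothetical mapping from cluster index to activity label; in practice the
# association would come from labeled data or calibration, since k-means
# cluster indices are arbitrary.
activity_names = {0: "activity A", 1: "activity B", 2: "activity C"}
activity_over_time = [activity_names[int(c)] for c in cluster_per_point]

# Update the activity history: record each activity and the point in time at
# which it started, which also captures transitions between activities.
activity_history = []
for index, activity in enumerate(activity_over_time):
    if not activity_history or activity_history[-1][0] != activity:
        activity_history.append((activity, index))
print(activity_history)
```

In practice, the cluster-to-activity association would be learned from labeled examples or user calibration, and the per-point assignments could be smoothed by analyzing their distribution over the time period before the activity history is updated.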
FIG. 1 is a block diagram of a platform 100 that inc