US-12625548-B2 - Eye tracking loss mitigations

Abstract

Various implementations disclosed herein include devices, systems, and methods that mitigate eye tracking loss. For example, a process may capture sensor data of a 3D region. The process may further generate eye tracking data comprising a 3D position of a portion of an eye (e.g., a pupil) based on tracking that 3D position, which is updated over time based on the captured sensor data. The process may further detect a condition interfering with updating the 3D position based on the captured sensor data. The process may further update the eye tracking data using a smoothed 3D position of the portion of the eye. The smoothed 3D position may be determined based on previously-tracked 3D positions of the portion of the eye and a current 3D position of the portion of the eye determined based on current sensor data.
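The smoothing described in the abstract, in which a reported position eases from the last tracked position toward the current measurement, could be sketched as a simple exponential ("lazy follow") update. The rate constant and the linear confidence-to-rate mapping below are illustrative assumptions, not values specified in the patent:

```python
def smoothed_position(previous, current, confidence, base_rate=0.1):
    """Ease a 3D eye position toward the current measurement.

    previous:   last smoothed (x, y, z) position
    current:    newly measured (x, y, z) position
    confidence: confidence in the current measurement, in [0.0, 1.0]
    base_rate:  minimum fraction of the gap closed per update (assumed)
    """
    # Higher confidence closes the gap faster; this linear mapping is illustrative.
    rate = base_rate + (1.0 - base_rate) * confidence
    return tuple(p + rate * (c - p) for p, c in zip(previous, current))
```

With confidence 1.0 the smoothed position snaps to the current measurement in one step; with low confidence it approaches gradually over successive updates, avoiding the sudden jump the patent seeks to mitigate.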

Inventors

  • Christian W Gosch
  • Riley C Borgard
  • Sheila M Santos
  • Ritesh Gangadhar Sholapur
  • Sabine Webel

Assignees

  • APPLE INC.

Dates

Publication Date
May 12, 2026
Application Date
May 28, 2024

Claims (19)

  1. A method comprising: at a head mounted device (HMD) having a processor, a sensor, and a display: capturing sensor data of a 3D region via the sensor, wherein the 3D region is in front of the display of the HMD that an eye of a user is expected to occupy while the HMD is worn by the user; generating eye tracking data comprising a 3D position of a portion of an eye based on tracking the 3D position of the portion of the eye, wherein the 3D position is updated over time based on the captured sensor data; detecting a condition interfering with updating the 3D position of the portion of the eye based on the captured sensor data; and following the condition, updating the eye tracking data using a smoothed 3D position of the portion of the eye, the smoothed 3D position determined based on one or more previously-tracked 3D positions of the portion of the eye and a current 3D position of the portion of the eye determined based on current sensor data, wherein the smoothed 3D position gradually reaches the current 3D position at a rate determined based on a confidence level of the current 3D position, and wherein the confidence level is configured to decrease as a function of an angular distance associated with a gaze direction of the eye.
  2. The method of claim 1, further comprising during the condition, updating the eye tracking data using the one or more previously-tracked 3D positions of the portion of the eye.
  3. The method of claim 1, wherein the confidence level is dependent on a position of the portion of the eye with respect to the 3D region.
  4. The method of claim 1, further comprising applying a hysteresis based on a threshold angular distance associated with the angular distance.
  5. The method of claim 1, wherein the portion of the eye is a pupil.
  6. The method of claim 1, wherein the condition is a blinking motion associated with the eye or a rapid motion associated with movement of the eye.
  7. The method of claim 1, wherein the sensor data is generated based on an IR glint pattern reflecting off of a surface of the eye, and wherein the sensor captures images of the IR glint pattern.
  8. The method of claim 7, wherein determining the 3D position of the portion of the eye is based on the IR glint pattern captured in the images.
  9. A head mounted device (HMD) comprising: a non-transitory computer-readable storage medium; and one or more processors coupled to the non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium comprises program instructions that, when executed on the one or more processors, cause the HMD to perform operations comprising: capturing sensor data of a 3D region via a sensor of the HMD, wherein the 3D region is in front of a display of the HMD that an eye of a user is expected to occupy while the HMD is worn by the user; generating eye tracking data comprising a 3D position of a pupil of an eye based on tracking the 3D position of the pupil of the eye, wherein the 3D position is updated over time based on the captured sensor data; detecting a condition interfering with updating the 3D position of the pupil of the eye based on the captured sensor data; locking a last known pupil position of one or more previously-tracked 3D positions of pupil coordinates of the pupil of the eye until a current 3D position of pupil coordinates of the pupil of the eye is determined based on current sensor data; and following the condition and the locking, updating the eye tracking data using a smoothed 3D position of pupil coordinates of the pupil of the eye, the smoothed 3D position determined based on the one or more previously-tracked 3D positions of pupil coordinates of the pupil of the eye and the current 3D position of pupil coordinates of the pupil of the eye.
  10. The HMD of claim 9, wherein the program instructions, when executed on the one or more processors, further cause the HMD to perform operations comprising: during the condition, updating the eye tracking data using the one or more previously-tracked 3D positions of pupil coordinates of the pupil of the eye.
  11. The HMD of claim 9, wherein the smoothed 3D position gradually reaches the current 3D position at a rate determined based on a confidence level of the current 3D position.
  12. The HMD of claim 11, wherein the confidence level is dependent on a position of the pupil of the eye with respect to the 3D region.
  13. The HMD of claim 11, wherein the confidence level is configured to decrease as a function of an angular distance associated with a gaze direction of the eye.
  14. The HMD of claim 13, wherein the program instructions, when executed on the one or more processors, further cause the HMD to perform operations comprising: applying a hysteresis based on a threshold angular distance associated with the angular distance.
  15. The HMD of claim 9, wherein the condition is a blinking motion associated with the eye or a rapid motion associated with movement of the eye.
  16. The HMD of claim 9, wherein the sensor data is generated based on an IR glint pattern reflecting off of a surface of the eye, and wherein the sensor captures images of the IR glint pattern.
  17. The HMD of claim 9, further comprising: adjusting a display of the HMD based on the smoothed 3D position.
  18. The HMD of claim 17, wherein the display is adjusted to compensate for lens distortion, chromatic aberration, or foveation.
  19. A non-transitory computer-readable storage medium storing program instructions executable via one or more processors of a head mounted device (HMD) to perform operations comprising: capturing sensor data of a 3D region via a sensor, wherein the 3D region is in front of a display of the HMD that an eye of a user is expected to occupy while the HMD is worn by the user; generating eye tracking data comprising a 3D position of a pupil of an eye based on tracking the 3D position of the pupil of the eye, wherein the 3D position is updated over time based on the captured sensor data; detecting a condition interfering with updating the 3D position of the pupil of the eye based on the captured sensor data; locking a last known pupil position of one or more previously-tracked 3D positions of pupil coordinates of the pupil of the eye until a current 3D position of pupil coordinates of the pupil of the eye is determined based on current sensor data; and following the condition and the locking, updating the eye tracking data using a smoothed 3D position of pupil coordinates of the pupil of the eye, the smoothed 3D position determined based on the one or more previously-tracked 3D positions of pupil coordinates of the pupil of the eye and the current 3D position of pupil coordinates of the pupil of the eye.
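Claims 9 and 19 recite locking the last known pupil position while the interfering condition persists, then updating via a smoothed position once measurements resume. A hypothetical three-state sketch of that behavior (the class, state names, fixed rate, and convergence threshold are all assumptions for illustration):

```python
from enum import Enum, auto

class State(Enum):
    TRACKING = auto()
    LOST = auto()         # condition detected; hold the last known position
    REACQUIRING = auto()  # new measurements arriving; ease toward them

class PupilTracker:
    def __init__(self, smoothing_rate=0.2):
        self.state = State.TRACKING
        self.position = None        # last reported 3D pupil position
        self.rate = smoothing_rate  # fixed rate assumed for simplicity

    def update(self, measurement):
        """measurement: (x, y, z) tuple, or None while the condition persists."""
        if measurement is None:
            self.state = State.LOST  # lock the last known position
            return self.position
        if self.state is State.LOST:
            self.state = State.REACQUIRING
        if self.state is State.REACQUIRING:
            # Lazy follow: close a fixed fraction of the gap each update.
            self.position = tuple(p + self.rate * (m - p)
                                  for p, m in zip(self.position, measurement))
            # Snap back to TRACKING once the smoothed position reaches the measurement.
            if max(abs(m - p) for p, m in zip(self.position, measurement)) < 1e-3:
                self.state = State.TRACKING
                self.position = measurement
        else:
            self.position = measurement
        return self.position
```

During the loss the tracker keeps reporting the locked position, so downstream display adjustments see no discontinuity; afterward the reported position converges to new measurements over several frames rather than jumping.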

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application Ser. No. 63/470,504, filed Jun. 2, 2023, which is incorporated herein in its entirety.

TECHNICAL FIELD

The present disclosure generally relates to systems, methods, and devices that provide eye tracking based on sensor data during periods in which sensor data is not continuously available, e.g., periods during which eye tracking loss events occur.

BACKGROUND

Existing eye tracking techniques are used for various applications. For example, devices configure how content is displayed based on eye tracking and/or may determine responses to user eye activity based on eye tracking. Existing eye tracking techniques may not adequately account for periods during which sensor data is not continuously available, e.g., periods during which an image or other sensor data corresponding to the eye's current location and/or orientation is unavailable or insufficiently accurate or precise.

SUMMARY

Various implementations disclosed herein include devices, systems, and methods that mitigate a temporary eye tracking loss. A temporary eye tracking loss may be caused by, inter alia, a user blinking or a fast eye motion. There may be a sudden jump between the last tracked eye position before an eye tracking loss event and the next acquired eye position determined subsequent to the event. In use cases in which content display is based on the eye tracking, the temporary loss and/or associated jumps may produce display adjustments that cause visual anomalies, e.g., content items appearing to jump from one place to another. Various implementations avoid such sudden jumps and/or provide other benefits by providing temporal smoothing of a three-dimensional (3D) position of a tracked portion of an eye (e.g., a pupil).
The temporal smoothing of the 3D position may provide a lazy follow effect (with respect to the 3D position of the portion of the eye) implemented until a 3D smoothed position reaches a latest measured location. A smoothing process may be performed with respect to a transition rate that is dependent on a confidence level with respect to a current eye position measurement. The confidence level may be dependent on a pupil location within an eye box. An eye box may comprise a 3D region located in front of a display of a head mounted device (HMD). The 3D region may comprise a region that an eye is expected to occupy while the HMD is being worn by a user. In some implementations, the confidence level may decrease as a function of an angular distance. In some implementations, hysteresis may be applied based on a threshold angular distance with respect to the angular distance.

In some implementations, an HMD device has a processor (e.g., one or more processors) that executes instructions stored in a non-transitory computer-readable medium to perform a method. The method performs one or more steps or processes. In some implementations, the HMD captures sensor data of a 3D region via a sensor of the HMD. The 3D region is in front of a display of the HMD that an eye of a user is expected to occupy while the HMD is worn by the user. In some implementations, eye tracking data is generated. The eye tracking data may include a 3D position of a portion of an eye based on tracking the 3D position of the portion of the eye. The 3D position may be updated over time based on the captured sensor data. In some implementations, a condition interfering with updating the 3D position of the portion of the eye is detected based on the captured sensor data. In some implementations, following the condition, the eye tracking data is updated using a smoothed 3D position of the portion of the eye.
The smoothed 3D position may be determined based on one or more previously-tracked 3D positions of the portion of the eye and a current 3D position of the portion of the eye determined based on current sensor data.

In accordance with some implementations, a device includes one or more processors, a non-transitory memory, and one or more programs; the one or more programs are stored in the non-transitory memory and configured to be executed by the one or more processors, and the one or more programs include instructions for performing or causing performance of any of the methods described herein. In accordance with some implementations, a non-transitory computer readable storage medium has stored therein instructions which, when executed by one or more processors of a device, cause the device to perform or cause performance of any of the methods described herein. In accordance with some implementations, a device includes: one or more processors, a non-transitory memory, and means for performing or causing performance of any of the methods described herein.

BRIEF DESCRIPTION OF THE DRAWINGS

So that the present disclosure can be understood by those of ordinary skill in the art, a more detailed description may be had by
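The Summary's confidence behavior, decreasing with angular distance and stabilized by hysteresis about a threshold, could be sketched as follows. The cosine falloff shape, the threshold, and the hysteresis band width are assumptions for illustration; the patent specifies only the qualitative behavior:

```python
import math

class GazeConfidence:
    """Confidence that falls off with gaze angular distance from a nominal axis.

    Hysteresis about the threshold prevents small jitter in the measured
    angle from repeatedly toggling between high- and low-confidence behavior.
    """
    def __init__(self, threshold_deg=20.0, hysteresis_deg=2.0):
        self.threshold = threshold_deg
        self.band = hysteresis_deg
        self.low_confidence = False  # current side of the threshold

    def confidence(self, angular_distance_deg):
        # Switch states only beyond threshold +/- band (hysteresis).
        if self.low_confidence:
            if angular_distance_deg < self.threshold - self.band:
                self.low_confidence = False
        elif angular_distance_deg > self.threshold + self.band:
            self.low_confidence = True
        if self.low_confidence:
            return 0.0
        # Confidence decreases smoothly with angle (cosine falloff, assumed).
        frac = min(angular_distance_deg / self.threshold, 1.0)
        return 0.5 * (1.0 + math.cos(math.pi * frac))
```

A confidence computed this way could feed the transition rate of the smoothing process described above: measurements taken while the gaze is far off-axis (e.g., near the edge of the eye box) then pull the smoothed position toward themselves only slowly, if at all.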