US-20260126855-A1 - Eye Tracking Data Filtering

US 20260126855 A1

Abstract

Eye tracking is performed by determining an initial pupil position of a user in relation to a lens situated in front of the user, detecting a change in pupil position in relation to the lens to an updated pupil position in relation to the lens, and determining that the updated pupil position is outside a bounding box associated with the lens. The updated pupil position is replaced with a replacement pupil position within the bounding box associated with the lens, and the replacement pupil position is utilized for eye-tracking functionality. Eye tracking is also performed by determining that a first pixel associated with a gaze direction is outside a visibility region, identifying a replacement pixel within the visibility region, determining an updated gaze angle based on the replacement pixel, and performing eye tracking using the updated gaze angle.

Inventors

  • Jacob Wilson
  • Tobias Eble
  • Sabine Webel
  • Hariprasad Puthukkootil Rajagopal
  • Andreas Gapel
  • Ritesh Gangadhar Sholapur

Assignees

  • APPLE INC.

Dates

Publication Date
2026-05-07
Application Date
2026-01-05

Claims (20)

  1. A method comprising: determining a gaze direction of a user gazing at a display from a position of an eye; identifying a first pixel on the display at which the gaze direction is targeted; determining that the first pixel is outside a portion of the display visible through a lens, wherein the lens is situated between the eye and the display; in response to a determination that the first pixel is outside the portion of the display visible through the lens: identifying a replacement pixel within the portion of the display visible through the lens and based on the gaze direction; determining a replacement gaze direction based on the position of the eye and a location of the replacement pixel on the display; and performing an eye tracking function using the replacement gaze direction.
  2. The method of claim 1, wherein identifying the replacement pixel comprises: determining that the replacement pixel is a closest pixel to the first pixel from within the visibility region.
  3. The method of claim 1, wherein identifying the replacement pixel further comprises: identifying a central pixel associated with a center of the visibility region; determining a vector from the central pixel to the first pixel on the display; and selecting the replacement pixel from a set of pixels on the display along the vector and within the portion of the display visible through the lens.
  4. The method of claim 1, wherein the portion of the display visible through the lens is further defined by a visibility mask based on a portion of the display determined to provide valid pixel data.
  5. The method of claim 1, wherein the portion of the display visible through the lens is determined based on hardware specifications for the display.
  6. The method of claim 1, wherein the portion of the display visible through the lens is defined by software utilizing the eye tracking function.
  7. The method of claim 1, wherein the display and the lens are comprised in a head mounted device.
  8. A non-transitory computer readable medium comprising computer readable code executable by one or more processors to: determine a gaze direction of a user gazing at a display from a position of an eye; identify a first pixel on the display at which the gaze direction is targeted; determine that the first pixel is outside a portion of the display visible through a lens, wherein the lens is situated between the eye and the display; in response to a determination that the first pixel is outside the portion of the display visible through the lens: identify a replacement pixel within the portion of the display visible through the lens and based on the gaze direction; determine a replacement gaze direction based on the position of the eye and a location of the replacement pixel on the display; and perform an eye tracking function using the replacement gaze direction.
  9. The non-transitory computer readable medium of claim 8, wherein the computer readable code to identify the replacement pixel comprises computer readable code to: determine that the replacement pixel is a closest pixel to the first pixel from within the visibility region.
  10. The non-transitory computer readable medium of claim 8, wherein the computer readable code to identify the replacement pixel comprises computer readable code to: identify a central pixel associated with a center of the visibility region; determine a vector from the central pixel to the first pixel on the display; and select the replacement pixel from a set of pixels on the display along the vector and within the portion of the display visible through the lens.
  11. The non-transitory computer readable medium of claim 8, wherein the portion of the display visible through the lens is further defined by a visibility mask based on a portion of the display determined to provide valid pixel data.
  12. The non-transitory computer readable medium of claim 8, wherein the portion of the display visible through the lens is determined based on hardware specifications for the display.
  13. The non-transitory computer readable medium of claim 8, wherein the portion of the display visible through the lens is defined by software utilizing the eye tracking function.
  14. The non-transitory computer readable medium of claim 8, wherein the display and the lens are comprised in a head mounted device.
  15. A system comprising: one or more processors; and one or more computer readable media comprising computer readable code executable by the one or more processors to: determine a gaze direction of a user gazing at a display from a position of an eye; identify a first pixel on the display at which the gaze direction is targeted; determine that the first pixel is outside a portion of the display visible through a lens, wherein the lens is situated between the eye and the display; in response to a determination that the first pixel is outside the portion of the display visible through the lens: identify a replacement pixel within the portion of the display visible through the lens and based on the gaze direction; determine a replacement gaze direction based on the position of the eye and a location of the replacement pixel on the display; and perform an eye tracking function using the replacement gaze direction.
  16. The system of claim 15, wherein the computer readable code to identify the replacement pixel comprises computer readable code to: determine that the replacement pixel is a closest pixel to the first pixel from within the visibility region.
  17. The system of claim 15, wherein the computer readable code to identify the replacement pixel comprises computer readable code to: identify a central pixel associated with a center of the visibility region; determine a vector from the central pixel to the first pixel on the display; and select the replacement pixel from a set of pixels on the display along the vector and within the portion of the display visible through the lens.
  18. The system of claim 15, wherein the portion of the display visible through the lens is further defined by a visibility mask based on a portion of the display determined to provide valid pixel data.
  19. The system of claim 15, wherein the portion of the display visible through the lens is determined based on hardware specifications for the display.
  20. The system of claim 15, wherein the display and the lens are comprised in a head mounted device.
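The pixel-replacement procedure recited in claims 1 and 3 can be sketched in Python. This is an illustrative sketch only, not the claimed implementation: the boolean visibility mask, the (x, y) coordinate convention, and the function names are assumptions, and the walk along the center-to-pixel vector is one straightforward way to realize the claimed selection.

```python
import numpy as np

def replacement_pixel(first_px, center_px, visible_mask):
    """Walk from the visibility-region center toward an out-of-region
    pixel and return the last pixel along that vector that still lies
    within the visible portion of the display (cf. claim 3).

    first_px, center_px: (x, y) tuples; visible_mask: bool array
    indexed as visible_mask[y, x], True where the display is visible.
    """
    direction = np.asarray(first_px, float) - np.asarray(center_px, float)
    length = np.linalg.norm(direction)
    if length == 0:
        return tuple(center_px)
    step = direction / length
    best = tuple(center_px)
    for t in range(int(length) + 1):
        px = np.round(np.asarray(center_px, float) + step * t).astype(int)
        x, y = int(px[0]), int(px[1])
        in_bounds = 0 <= y < visible_mask.shape[0] and 0 <= x < visible_mask.shape[1]
        if in_bounds and visible_mask[y, x]:
            best = (x, y)  # still visible; keep walking outward
        else:
            break  # left the visible region; keep the last valid pixel
    return best

def replacement_gaze_direction(eye_pos, pixel_pos):
    """Unit vector from the eye position to the replacement pixel's
    3-D location on the display (cf. claim 1)."""
    v = np.asarray(pixel_pos, float) - np.asarray(eye_pos, float)
    return v / np.linalg.norm(v)
```

For example, with a 5x5 visible window centered at (5, 5) and a gaze target at (10, 5), the walk stops at the window's right edge and the replacement pixel is the last visible pixel along that row.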

Description

BACKGROUND

This disclosure relates generally to image processing. More particularly, but not by way of limitation, this disclosure relates to techniques and systems for generating and managing eye-tracking data for improved eye-tracking techniques.

Eye tracking is a technique utilized in many fields, such as gaze detection, pose estimation, facial analysis and recognition, and the like. Eye tracking often forms the basis of these operations and may be thought of as the process of electronically locating the point of a person's gaze, or following and recording the movement of the person's point of gaze. In practice, eye tracking is provided by locating and tracking a pupil location and gaze direction. However, sensors used to track the eye often shift or jitter such that the eye-tracking data is not always perfectly calibrated. Accordingly, the eye-tracking data can be jittery or invalid. What is needed is an improved technique for managing eye-tracking data.

BRIEF DESCRIPTION OF THE DRAWINGS

FIGS. 1A-B show example diagrams of a setup for eye tracking and performing pupil location.
FIG. 2 shows a flowchart of a technique for managing pupil position information for eye-tracking techniques, according to one or more embodiments.
FIG. 3 shows a flowchart of a technique for refining a pupil position for eye tracking, according to one or more embodiments.
FIGS. 4A-B show example diagrams of a gaze direction with respect to a visibility region, according to one or more embodiments.
FIGS. 5A-B show flowcharts of techniques for refining a gaze angle for eye-tracking techniques, according to one or more embodiments.
FIG. 6 shows a flowchart of a technique for identifying a replacement pixel to refine a gaze angle, according to one or more embodiments.
FIG. 7 shows, in block diagram form, a multifunction electronic device in accordance with one or more embodiments.
FIG. 8 shows, in block diagram form, a computer system in accordance with one or more embodiments.
DETAILED DESCRIPTION

This disclosure pertains to systems, methods, and computer-readable media to refine and smooth eye-tracking data for enhanced performance in eye-tracking techniques. In one or more embodiments, a pupil position is clamped within a predetermined region. A lens through which a pupil is monitored may have a predetermined region that is a known calibrated region, whereas other regions of the lens may be less calibrated. As such, the pupil location should be maintained within the calibrated region. In one or more embodiments, an initial pupil location is determined in relation to a lens situated in front of the eye. A change in the pupil position to an updated pupil position may then be detected in relation to the lens. If the updated pupil location is outside the determined bounding box, for example one associated with a calibrated region, then a replacement pupil location is selected within the bounding box. The replacement pupil location is then used for eye-tracking techniques. In some embodiments, in order to avoid jitter, the eye-tracking system may change the pupil location over a series of frames such that the pupil location transitions from the original pupil location to the replacement pupil location, for example based on a time-based easing function.

In one or more embodiments, a gaze direction may be refined for use in eye-tracking techniques. In particular, the gaze direction is refined such that the user is gazing toward a visibility region. A first pixel may be determined as being associated with a gaze direction outside a visibility region. A replacement pixel is identified within the visibility region. The replacement pixel may be selected from along a vector from a center of a field of view to the first pixel. A gaze angle is determined based on a location of the replacement pixel, and an eye-tracking function is performed using the gaze angle.
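The clamping and frame-by-frame transition described above can be illustrated with a short Python sketch. The disclosure leaves the easing function unspecified, so the smoothstep curve below, the fixed frame count, and the function names are assumptions made for illustration.

```python
import numpy as np

def clamp_pupil(pupil_xy, box_min, box_max):
    """Clamp a pupil position to the calibrated bounding box
    associated with the lens."""
    return np.clip(np.asarray(pupil_xy, float), box_min, box_max)

def eased_transition(start_xy, target_xy, frame, n_frames):
    """Interpolate from the original pupil position toward the
    replacement position over n_frames, using a smoothstep easing
    curve so the tracked position does not visibly jump (jitter)."""
    t = min(max(frame / n_frames, 0.0), 1.0)
    t = t * t * (3.0 - 2.0 * t)  # smoothstep: eases in and out
    return (1.0 - t) * np.asarray(start_xy, float) + t * np.asarray(target_xy, float)
```

With this sketch, a pupil detected at (1.5, -0.2) against a unit bounding box clamps to (1.0, 0.0), and the eased transition starts exactly at the original position, ends exactly at the replacement, and passes through the halfway point at the middle frame.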
For purposes of this disclosure, the term “gaze origin” refers to the center of an eye for which gaze is determined. The term “pupil position” refers to the position on the surface of the eye where the pupil is located. The term “gaze direction” is a direction of a gaze originating from the gaze origin and passing through the pupil position.

In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed concepts. As part of this description, some of this disclosure's drawings represent structures and devices in block diagram form in order to avoid obscuring the novel aspects of the disclosed concepts. In the interest of clarity, not all features of an actual implementation may be described. Further, as part of this description, some of this disclosure's drawings may be provided in the form of flowcharts. The boxes in any particular flowchart may be presented in a particular order. It should be understood, howeve