CN-115904061-B - Gaze defect compensation
Abstract
An eye tracking system includes a controller configured to, for one or both of a left eye and a right eye of a user: receive eye measurement data associated with the eye, determine an optical axis of the eye from the eye measurement data, select one of a plurality of eye models based on a direction of the optical axis, and determine a gaze vector of the eye by applying the selected eye model to the eye measurement data.
Inventors
- Daniel Johnson tonils
Assignees
- Tobii AB
Dates
- Publication Date: 2026-05-05
- Application Date: 2022-09-28
- Priority Date: 2021-09-30
Claims (19)
- 1. An eye tracking system comprising a controller comprising an optical axis detector and a gaze estimation module, wherein the controller is configured to: receive, using the optical axis detector, eye measurement data associated with an eye of a user; determine an optical axis of the eye from the eye measurement data using the optical axis detector; select one of a plurality of eye models based on a direction of the optical axis using the gaze estimation module to compensate for gaze defects characterized by eye misalignment; and determine a gaze vector of the eye using the gaze estimation module by applying the selected eye model to the eye measurement data.
- 2. The eye tracking system of claim 1, wherein the controller is configured to select one of a plurality of predetermined eye models based on the direction of the optical axis.
- 3. The eye tracking system of claim 1, wherein the controller is configured to select one of a plurality of eye models by: selecting a gaze offset value from a plurality of gaze offset values based on the direction of the optical axis, and applying the selected gaze offset value to a reference eye model.
- 4. The eye tracking system of claim 2, wherein the controller is configured to select one of a plurality of eye models by: selecting a gaze offset value from a plurality of gaze offset values based on the direction of the optical axis, and applying the selected gaze offset value to the selected predetermined eye model.
- 5. The eye tracking system of claim 1, wherein the controller is configured to determine the plurality of eye models by determining, during a calibration process, at least one of: a plurality of eye modeling parameter sets defining a corresponding plurality of predetermined eye models; a reference eye model and a plurality of gaze offset values for application to the reference eye model; or a plurality of eye modeling parameter sets defining the corresponding plurality of predetermined eye models and a plurality of gaze offset values for application to a predetermined eye model of the plurality of predetermined eye models; and wherein the plurality of eye modeling parameter sets or the plurality of gaze offset values correspond to a plurality of different stimulus point regions, each region associated with a different quadrant of the user's field of view.
- 6. The eye tracking system of claim 5, wherein the controller is configured to: cause a plurality of stimulus points to be displayed to the user one at a time, and receive eye measurement data for each stimulus point.
- 7. The eye tracking system of claim 6, wherein the plurality of stimulus points comprises 6 or more stimulus points.
- 8. The eye tracking system of claim 6, wherein each of the plurality of eye modeling parameter sets and the plurality of gaze offset values corresponds to: each stimulus point, or a stimulus point region.
- 9. The eye tracking system of claim 5, wherein the controller is configured to: determine a left eye gaze vector of a left eye of the user; determine a right eye gaze vector of a right eye of the user; determine a weight for each of the left eye gaze vector and the right eye gaze vector based on a left eye model selected for the left eye and a right eye model selected for the right eye, wherein the weights are based on a magnitude of a gaze offset value or a change in a set of eye modeling parameters associated with the selected eye model; and apply the weight for the left eye gaze vector to the left eye gaze vector and the weight for the right eye gaze vector to the right eye gaze vector to provide a combined gaze vector.
- 10. The eye tracking system of claim 9, wherein the controller is configured to determine the weight based on a magnitude of the gaze offset value associated with the selected eye model for each of the left eye gaze vector and the right eye gaze vector.
- 11. The eye tracking system of claim 10, wherein the controller is configured to determine the weight from a change in gaze offset values associated with the selected eye model relative to adjacent values of the plurality of gaze offset values.
- 12. The eye tracking system of claim 9, wherein the controller is configured to determine the weight from a change in a value of the eye modeling parameter set associated with the selected eye model relative to adjacent values of the plurality of eye modeling parameter sets.
- 13. The eye tracking system of claim 9, wherein the controller is configured to determine a plurality of weights during the calibration process, each weight of the plurality of weights corresponding to at least one of: the plurality of eye modeling parameter sets, the plurality of gaze offset values, or the plurality of stimulus points.
- 14. The eye tracking system of claim 9, wherein each weight comprises a value from 0 to 1.
- 15. The eye tracking system of claim 9, wherein the controller is further configured to: redisplay the plurality of stimulus points to the user one at a time, and for each redisplayed stimulus point: receive eye measurement data; select the eye model corresponding to the stimulus point; calculate a gaze vector using the selected eye model and the eye measurement data; calculate a difference between the calculated gaze vector and a known gaze vector corresponding to the stimulus point; and determine the weight based on the difference.
- 16. A head-mounted device comprising the eye tracking system of any preceding claim.
- 17. An eye tracking method, the method comprising: receiving, using an optical axis detector, eye measurement data associated with an eye of a user; determining an optical axis of the eye from the eye measurement data using the optical axis detector; selecting one of a plurality of eye models based on the direction of the optical axis using a gaze estimation module to compensate for gaze defects characterized by eye misalignment; and determining a gaze vector of the eye using the gaze estimation module by applying the selected eye model to the eye measurement data.
- 18. A method of calibrating an eye tracking system, the method comprising: causing, by a controller, a plurality of stimulus points to be displayed to an eye of a user one at a time; receiving eye measurement data for each stimulus point using an optical axis detector; and determining, based on the eye measurement data, at least one of: a plurality of eye modeling parameter sets defining a corresponding plurality of predetermined eye models; a reference eye model and a plurality of gaze offset values for application to the reference eye model; or a plurality of eye modeling parameter sets defining the corresponding plurality of predetermined eye models and a plurality of gaze offset values for application to one of the plurality of predetermined eye models; wherein the plurality of eye modeling parameter sets and/or the plurality of gaze offset values correspond to each of the plurality of stimulus points or to a region of the plurality of stimulus points, and wherein the calibration is performed to define a strabismus compensation pattern for the eye tracking system.
- 19. One or more non-transitory computer-readable storage media storing computer-executable instructions which, when executed by a computing system, cause the computing system to perform the method of claim 17 or claim 18.
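The per-eye weighting of claims 9 to 15 can be sketched as follows. This is a minimal illustration only: the 1/(1 + |offset|) weighting rule, the renormalization step, and all function names are assumptions for the sake of the example, not definitions taken from the claims.

```python
# Hypothetical sketch of the weighting in claims 9-15: each eye's gaze
# vector receives a weight in (0, 1] derived from the magnitude of the
# gaze offset associated with its selected eye model (a larger offset
# suggesting a less reliable eye). The 1/(1 + |offset|) rule is assumed.

def weight_from_offset(offset_deg):
    """Map a gaze-offset magnitude (degrees) to a weight in (0, 1]."""
    return 1.0 / (1.0 + abs(offset_deg))

def combine_gaze(left_vec, right_vec, left_offset_deg, right_offset_deg):
    """Blend left/right gaze vectors using offset-derived weights,
    then renormalize to a unit combined gaze vector."""
    wl = weight_from_offset(left_offset_deg)
    wr = weight_from_offset(right_offset_deg)
    total = wl + wr
    combined = tuple((wl * l + wr * r) / total
                     for l, r in zip(left_vec, right_vec))
    norm = sum(c * c for c in combined) ** 0.5
    return tuple(c / norm for c in combined)
```

With equal offsets the two eyes contribute equally; as one eye's calibrated offset grows, the combined gaze vector leans toward the better-aligned eye.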
Description
Gaze defect compensation

Technical Field

The present disclosure relates generally to the field of eye tracking. In particular, the present disclosure relates to eye tracking systems and methods for providing accurate eye tracking in the presence of health-related eye defects.

Background

In eye tracking applications, a digital image of a user's eye is captured and analyzed to estimate the user's gaze direction. The estimation of gaze direction may be based on computer-based image analysis of the features of the imaged eye. One known eye tracking method uses infrared light and an image sensor: the infrared light is directed towards the pupil of the user, and the reflection of the light is captured by the image sensor. Many eye tracking systems estimate gaze direction based on the pupil position and the identification of glints, or corneal reflections. The eye tracking system may include a calibration sequence for defining an eye tracking model that maps pupil position and glints to gaze directions. However, such eye tracking models may perform poorly for users with health-related eye defects. Portable or wearable eye tracking devices have also been described previously. One such eye tracking system is described in U.S. Patent No. 9,041,787 (the entire contents of which are incorporated herein by reference), which describes a wearable eye tracking device that uses an illuminator and an image sensor to determine a gaze direction.
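The pupil-and-glint mapping mentioned above is commonly realized as a calibrated polynomial or affine map from image-space features to a gaze point. A minimal sketch of such a map, assuming pre-fitted affine coefficients (the function name and coefficient values are illustrative, not taken from this patent or from U.S. Patent No. 9,041,787):

```python
# Illustrative affine pupil-glint-to-gaze mapping, a common approach in
# glint-based eye tracking (not this patent's specific method). A
# calibration sequence would fit the six coefficients from samples
# collected while the user fixates known stimulus points.

def map_pupil_glint_to_gaze(dx, dy, coeffs):
    """Map a pupil-to-glint vector (dx, dy) in image coordinates to a
    2-D gaze point using affine coefficients (ax, bx, cx, ay, by, cy)."""
    ax, bx, cx, ay, by, cy = coeffs
    return (ax * dx + bx * dy + cx, ay * dx + by * dy + cy)
```

For a well-aligned eye a single fixed map of this kind may suffice; with strabismus it can be systematically wrong in parts of the field of view, which motivates the gaze-dependent model selection of the present disclosure.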
Disclosure of Invention

According to a first aspect of the present disclosure, there is provided an eye tracking system comprising a controller configured to, for one or both of a left eye and a right eye of a user: receive eye measurement data associated with the eye; determine an optical axis of the eye from the eye measurement data; select one of a plurality of eye models based on the direction of the optical axis; and determine a gaze vector of the eye by applying the selected eye model to the eye measurement data.

The eye tracking system may advantageously implement a gaze-dependent eye model that applies different computational processes depending on the direction of the optical axis, i.e. of the user's gaze. By selecting one of a plurality of eye models based on the direction of the optical axis, the eye tracking system may advantageously compensate for strabismus or other gaze-related defects of one or both eyes.

The controller may be configured to select one of a plurality of predetermined eye models based on the direction of the optical axis. The controller may be configured to select one of the plurality of eye models by selecting a gaze offset value from a plurality of gaze offset values based on the direction of the optical axis and applying the selected gaze offset value to a reference eye model. The controller may be configured to select one of the plurality of eye models by selecting a gaze offset value from a plurality of gaze offset values based on the direction of the optical axis and applying the selected gaze offset value to a selected predetermined eye model.
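The gaze-dependent selection described above can be sketched as a lookup of a calibrated gaze offset keyed by the field-of-view quadrant of the optical axis, applied on top of a reference model. All names, the quadrant rule, and the offset values below are assumptions for illustration, not the patent's definitions:

```python
import math

# Hypothetical per-quadrant gaze offsets (yaw, pitch in degrees), e.g. as
# calibrated for a strabismic eye. The values are illustrative only.
GAZE_OFFSETS = {
    "upper_left":  (1.5, -0.5),
    "upper_right": (0.2,  0.1),
    "lower_left":  (2.0, -1.0),
    "lower_right": (0.3,  0.2),
}

def quadrant_of(optical_axis):
    """Classify an optical-axis unit vector (x, y, z) into a
    field-of-view quadrant."""
    x, y, _ = optical_axis
    horiz = "left" if x < 0 else "right"
    vert = "upper" if y >= 0 else "lower"
    return f"{vert}_{horiz}"

def gaze_vector(optical_axis):
    """Select the quadrant's gaze offset and apply it to the optical
    axis (the 'reference model' here is the identity mapping)."""
    yaw_off, pitch_off = GAZE_OFFSETS[quadrant_of(optical_axis)]
    x, y, z = optical_axis
    yaw = math.atan2(x, z) + math.radians(yaw_off)
    pitch = math.asin(max(-1.0, min(1.0, y))) + math.radians(pitch_off)
    # Rebuild a unit gaze vector from the corrected angles.
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))
```

A deployed system would replace the identity "reference model" with its full calibrated eye model, and could interpolate offsets near quadrant boundaries rather than switching abruptly.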
The controller may be configured to determine the plurality of eye models by determining, during a calibration process: a plurality of eye modeling parameter sets defining a corresponding plurality of predetermined eye models; a reference eye model and a plurality of gaze offset values for application to the reference eye model; or a plurality of eye modeling parameter sets defining a corresponding plurality of predetermined eye models and a plurality of gaze offset values for application to one of the predetermined eye models.

The controller may be configured to determine the plurality of eye models by determining, in a calibration process: a plurality of eye modeling parameter sets defining a corresponding plurality of predetermined eye models for a corresponding plurality of stimulus points or stimulus point regions; a reference eye model and a plurality of gaze offset values for the corresponding plurality of stimulus points or stimulus point regions, wherein the gaze offset values are for application to the reference eye model; or a plurality of eye modeling parameter sets defining a corresponding plurality of predetermined eye models for the corresponding plurality of stimulus point regions, and a plurality of gaze offset values for a corresponding subset of stimulus point regions, wherein the gaze offset values are for application to one of the predetermined eye models.

The controller may be configured to cause a plurality of stimulus points to be displayed to the user one at a time and to receive eye measurement data for each stimulus point. The plurality of stimulus points may include six or more stimulus points. Each of the plurality of eye modeling parameter sets and/or the plurality of gaze offset values may correspond to each stimulus point or to a stimulus point region. The controller may be configured to determine a gaze vector of a left eye of the