US-12616366-B2 - Systems and methods for evaluating eye movement disorders through eye-tracking exercises
Abstract
A patient's visual health can be evaluated via a virtual reality (VR) system, which includes a VR headset in electronic communication with a computing device. The computing device causes virtual environments, which can include objects, to be displayed on the VR headset. Using varying combinations of eye-tracking sensors, eye-tracking cameras, motion-tracking sensors, handheld devices, and microphones, the VR headset collects data about the patient as she tracks the objects while they are displayed in different positions in the virtual environments. Optionally, advanced algorithms in the computing device dynamically alter the positions of the objects and analyze the patient's eye-tracking to evaluate the patient for eye movement disorders. This dynamic evaluation can facilitate a wider scope of testing and a more detailed assessment of the patient's ocular health, as compared to traditional ocular evaluation methods.
Inventors
- Steven Lee
- Julia Zhen
- ChyrSong Ting
- Matthew James Golino
- Justin Paul Dempsey
- Jeffrey Joseph Fillingham
Assignees
- Zenni Optical, Inc.
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2024-08-21
Claims (17)
- 1 . A method of evaluating eye movement, the method comprising: displaying a virtual environment on screens of a virtual reality (VR) headset worn by a patient; displaying an object in the virtual environment at a first position; monitoring a first input of the patient, wherein the first input comprises an eye movement of the patient in response to the object being displayed at the first position; displaying the object in the virtual environment at a second position, wherein the second position is different from the first position; monitoring a second input of the patient, wherein the second input comprises the eye movement of the patient in response to the object being displayed at the second position; and comparing one or more of the first and second input to a database; wherein displaying the object at the first and second positions comprises incrementally adjusting a position of the object in the virtual environment, and wherein the first position is five degrees left of a center point in the virtual environment and the second position is five degrees right of the center point.
- 2 . The method of claim 1 , wherein displaying the object at the first and second positions comprises instantly showing the object in the first position, instantly removing the object from the first position, instantly showing the object in the second position, and instantly removing the object from the second position.
- 3 . The method of claim 1 , wherein displaying the object at the first and second positions comprises fading the object into the first position, fading the object out of the first position, fading the object into the second position, and fading the object out of the second position.
- 4 . The method of claim 1 , wherein displaying the object at the first and second positions comprises displaying the object as the object travels along a continuous path between the first and second positions.
- 5 . The method of claim 1 , wherein monitoring the first and second inputs comprises continuously monitoring the eye movements of the patient.
- 6 . The method of claim 1 , wherein monitoring the first and second inputs comprises identifying a delay between the object being displayed at the first and second positions and the eye movement of the patient in response to the object being displayed at the first and second positions.
- 7 . The method of claim 1 , wherein comparing one or more of the first and second input to the database comprises comparing one or more of the first and second input to the database in real-time as the patient responds to the object being displayed at the first and second positions.
- 8 . The method of claim 1 , further comprising quantifying deviation angles of the eye movements of the patient.
- 9 . The method of claim 8 , further comprising calculating a percent difference in the deviation angles of each eye of the patient.
- 10 . The method of claim 9 , further comprising recommending evaluation for eye movement disorders if the percent difference is 25% or more.
- 11 . A system for evaluating eye movement, the system comprising: a virtual reality (VR) headset comprising at least one eye-tracking sensor and at least one eye-tracking camera, the at least one eye-tracking sensor and the at least one camera being configured to collect eye movement data of a patient wearing the VR headset; and a computing device in electronic communication with the VR headset, the computing device being configured to cause an object to be displayed at various positions in a virtual environment and process the eye movement data to identify eye movement patterns of the patient, wherein the eye movement data comprises eye movements of the patient in response to the object being displayed at various positions in the virtual environment, wherein the various positions comprise a first position five degrees left of a center point in the virtual environment and a second position five degrees right of the center point.
- 12 . The system of claim 11 , further comprising a handheld device in electronic communication with the VR headset and the computing device, wherein the patient can use the handheld device to provide input in response to the object being displayed at various positions in the virtual environment.
- 13 . The system of claim 11 , wherein the eye movement data collected by the at least one eye-tracking sensor and the at least one eye-tracking camera comprises one or more of an accuracy, speed, and coordination of eye movements of the patient.
- 14 . The system of claim 11 , wherein the at least one eye-tracking sensor and the at least one eye-tracking camera are configured to collect the eye movement data in real-time and communicate the eye movement data with the computing device in real-time.
- 15 . The system of claim 11 , wherein the at least one eye-tracking camera is an infrared camera.
- 16 . The system of claim 11 , wherein the at least one eye-tracking camera comprises at least two eye-tracking cameras with an eye-tracking camera pointed at each pupil of the patient.
- 17 . The system of claim 11 , wherein the computing device comprises an algorithm that compares the eye movement patterns of the patient to a database to determine whether the eye movement patterns of the patient deviate from healthy ocular motor behavior.
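As an illustration of the percent-difference check described in claims 8-10, the following is a minimal sketch. It is not part of the claimed method; the function names and the use of a mean-magnitude-based percent difference are assumptions, as the disclosure does not specify the exact formula.

```python
def percent_difference(left_deg: float, right_deg: float) -> float:
    # Percent difference between the deviation angles of the two eyes,
    # relative to the mean of their magnitudes (an assumed definition).
    mean = (abs(left_deg) + abs(right_deg)) / 2
    if mean == 0:
        return 0.0
    return abs(abs(left_deg) - abs(right_deg)) / mean * 100


def recommend_evaluation(left_deg: float, right_deg: float,
                         threshold: float = 25.0) -> bool:
    # Per claim 10: recommend evaluation for eye movement disorders
    # if the percent difference is 25% or more.
    return percent_difference(left_deg, right_deg) >= threshold
```

For example, deviation angles of 4 and 6 degrees yield a 40% difference and would trigger a recommendation, while equal angles yield 0% and would not.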
Description
BACKGROUND

Field of the Inventions

The present application relates to methods of assessing various ocular conditions through extended reality systems. More specifically, methods and systems are applied to conduct visual tasks and exams in extended reality environments to evaluate patients for ocular conditions and diseases, such as eye misalignment and visual processing disorders.

Description of the Related Art

As virtual reality (VR) technology has become increasingly sophisticated, new highly immersive experiences have been made possible through improvements in head and motion tracking systems. Eye-tracking technology allows systems to detect and respond to where the user is looking. This capability enhances user interaction and makes virtual environments more responsive and engaging. Eye tracking is being integrated into a variety of VR applications, from gaming and training simulations to medical diagnostics and research, as it offers a more intuitive way for users to interact with digital content.

SUMMARY

The systems, methods, and devices of this disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein. Despite the advancements in VR technology, and in particular eye-tracking technology, in accordance with some embodiments disclosed herein is the realization that VR technology can provide unique eyecare solutions by monitoring and tracking one or more of the user's eyes and providing a diagnostic, a treatment protocol, and/or a treatment system. Indeed, in accordance with some embodiments disclosed herein is the realization that VR technology can be used to address challenges associated with diagnosing and treating a variety of eye disorders and ocular conditions, such as eye misalignment, macular degeneration, tear film abnormalities, floaters, eye-tracking deficits, motion sensitivity, and other eye movement disorders.
In some embodiments, a virtual reality (VR) guided examination can be conducted for diagnosing and measuring eye movement disorders through a series of interactive eye-tracking exercises. This method can be implemented through a system that comprises a high-resolution VR headset integrated with precision eye-tracking technology and specialized software capable of generating a variety of visual stimuli and tasks. Patients wear the VR headset and engage in a series of exercises specifically designed to assess different aspects of eye movement, such as saccades, smooth pursuits, and fixation stability. The eye-tracking sensors continuously monitor the patient's eye movements, providing real-time data on the accuracy, speed, and coordination of their ocular motor functions.

Optionally, the VR software can employ a range of diagnostic tasks, such as following moving objects, rapidly shifting gaze between fixed points, and maintaining steady focus on a single target. These tasks are tailored to detect and measure anomalies in eye movements that are indicative of disorders such as nystagmus, strabismus, and oculomotor nerve palsies. The software processes the eye-tracking data using advanced algorithms to identify patterns and deviations from normal ocular motor behavior. The results are compiled into a detailed report that highlights specific eye movement abnormalities, providing valuable insights for clinicians to diagnose and monitor the progression of eye movement disorders.

To construct the VR-based eye movement disorder diagnostic system, begin with a high-quality VR headset, such as the Oculus Quest 2, integrated with high-precision eye-tracking technology. The eye-tracking sensors should include infrared cameras capable of capturing detailed and rapid eye movements with high accuracy. The software development involves creating a library of eye-tracking exercises designed to evaluate various aspects of eye movement.
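One standard way to quantify the delay between a stimulus appearing and the patient's responding eye movement (the delay referenced in claim 6) is a velocity-threshold saccade detector. The sketch below is illustrative only; the sample format, the 30 deg/s threshold, and the function name are assumptions, not details from the disclosure.

```python
def saccade_latency_ms(samples, onset_ms, vel_threshold=30.0):
    # samples: list of (timestamp_ms, gaze_angle_deg) pairs in time order.
    # Returns the delay from stimulus onset to the first sample whose
    # angular velocity exceeds vel_threshold (deg/s), or None if no
    # saccade-like movement is found.
    prev_t, prev_x = None, None
    for t, x in samples:
        if prev_t is not None and t > onset_ms:
            dt = (t - prev_t) / 1000.0  # seconds between samples
            if dt > 0 and abs(x - prev_x) / dt >= vel_threshold:
                return t - onset_ms
        prev_t, prev_x = t, x
    return None
```

In practice the threshold and sampling rate would be chosen to match the headset's eye-tracking hardware.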
These exercises include tasks such as tracking moving targets, performing quick gaze shifts, and maintaining prolonged focus on static objects. Once the hardware and software components are integrated, the system undergoes calibration using a control group of individuals with known eye movement profiles to establish baseline performance metrics and validate the accuracy of the diagnostic algorithms.

Users can then operate the system by wearing the VR headset and participating in the guided eye-tracking exercises within the virtual environments. The eye-tracking sensors monitor their eye movements and responses to the visual stimuli, while the software records and analyzes the data in real-time. The user receives a detailed report outlining their eye movement performance, highlighting any deviations from normal patterns, and providing recommendations for further medical consultation if necessary. This approach offers a precise, non-invasive, and user-friendly method for diagnosing and measuring eye movement disorders, providing substantial benefits for both clinicians and patients.
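The comparison of a patient's metrics against the control-group baseline could take many forms; one simple possibility is a per-metric z-score against the baseline samples. The sketch below is an assumption for illustration (the metric names, data shapes, and the z-score cutoff of 2.0 are not specified in the disclosure).

```python
import statistics


def flag_deviations(patient_metrics, baseline, z_cutoff=2.0):
    # patient_metrics: dict of metric name -> patient's measured value.
    # baseline: dict of metric name -> list of values from the control
    # group of individuals with known eye movement profiles.
    # Flags a metric when the patient's z-score magnitude exceeds z_cutoff.
    flags = {}
    for name, value in patient_metrics.items():
        ref = baseline.get(name)
        if not ref or len(ref) < 2:
            continue  # not enough baseline data to judge this metric
        mu = statistics.mean(ref)
        sd = statistics.stdev(ref)
        if sd == 0:
            continue
        flags[name] = abs((value - mu) / sd) > z_cutoff
    return flags
```

Flagged metrics would then feed the report described above, highlighting deviations from normal patterns and prompting a recommendation for further medical consultation.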