US-12616365-B2 - Methods and systems for implementing virtual vision test for night vision and glare sensitivity

US 12616365 B2

Abstract

A virtual eye test can be conducted to evaluate night vision and glare sensitivity in a virtual reality (VR) environment. The test can be conducted using an electronic device that includes a head-mounted display (HMD) and a camera. The electronic device can generate a VR user interface corresponding to a photorealistic virtual environment and render the VR user interface on the HMD. The electronic device can simulate one or more dynamic lighting scenarios and, while simulating these scenarios, continuously track in real time, using the camera, eye movements and response times to visual stimuli. The device can then evaluate the user response based on the eye movements and response times for testing night vision and glare sensitivity.
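The patent does not publish an implementation, but the abstract's core measurement — how long after a stimulus appears the tracked gaze first lands on it — can be sketched as below. All names, the normalized gaze coordinates, and the fixation radius are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t_ms: float  # timestamp of the camera frame, in milliseconds
    x: float     # horizontal gaze coordinate (normalized 0..1, assumed)
    y: float     # vertical gaze coordinate (normalized 0..1, assumed)

def response_time_ms(stimulus_t_ms, stimulus_xy, samples, radius=0.05):
    """Return the delay until gaze first lands within `radius` of the
    stimulus after its onset, or None if the user never fixates it.
    The 0.05 radius is an arbitrary placeholder tolerance."""
    sx, sy = stimulus_xy
    for s in samples:
        if s.t_ms < stimulus_t_ms:
            continue  # ignore frames captured before the stimulus appeared
        if (s.x - sx) ** 2 + (s.y - sy) ** 2 <= radius ** 2:
            return s.t_ms - stimulus_t_ms
    return None
```

In a real HMD pipeline the `GazeSample` stream would come from the device's eye-tracking camera at the display frame rate; here it is just a list so the scoring logic stands alone.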

Inventors

  • Steven Lee
  • Julia ZHEN
  • ChyrSong Ting
  • Matthew James GOLINO
  • Justin Paul DEMPSEY
  • Jeffrey Joseph FILLINGHAM

Assignees

  • Zenni Optical, Inc.

Dates

Publication Date
2026-05-05
Application Date
2024-08-29

Claims (20)

  1. A method of implementing a virtual vision test for night vision and glare sensitivity, comprising: at an electronic device including a head-mounted display (HMD) and a camera: generating a virtual reality (VR) user interface corresponding to a photorealistic virtual environment; rendering the VR user interface on the HMD; simulating one or more dynamic lighting scenarios in the VR user interface by using one or more light mapping techniques to simulate realistic light behavior, including scattering, shadowing, and reflections; and while simulating the one or more dynamic lighting scenarios, in real time: continuously tracking, using the camera, eye movements and response times in response to visual stimuli presented in the one or more dynamic lighting scenarios; and evaluating user response based on the eye movements and the response times for testing night vision and glare sensitivity.
  2. The method of claim 1, wherein the photorealistic virtual environment comprises a high-fidelity virtual environment that can dynamically adjust light levels, colors, and/or sources.
  3. The method of claim 2, wherein the high-fidelity virtual environment comprises dynamic light sources that incorporate movable light sources that can change intensity and position.
  4. The method of claim 1, wherein the photorealistic virtual environment includes one or more configurable parameters to alter environment settings, while simulating the one or more dynamic lighting scenarios.
  5. The method of claim 1, wherein the one or more dynamic lighting scenarios comprises randomized lighting scenarios that randomly change intensity from high intensity to low intensity and vice versa, without following real-world lighting scenarios.
  6. The method of claim 1, wherein the one or more dynamic lighting scenarios comprises one or more nighttime scenes in urban streets, country roads, or indoor settings, simulated with varying degrees of ambient light.
  7. The method of claim 1, wherein the one or more dynamic lighting scenarios comprises one or more low-light environments selected from the group consisting of: dimly lit parking garages, moonlit landscapes and twilight settings.
  8. The method of claim 1, wherein simulating the one or more dynamic lighting scenarios comprises varying direction and intensity of light from one or more light sources hitting an eye, wherein the light causes a clouding effect leading to a glare.
  9. The method of claim 1, wherein simulating the one or more dynamic lighting scenarios comprises simulating one or more scenarios for assessing the ability to distinguish between different shades of gray comprising changing optotype direction, whereby after solid black light, solid black is changed to a level of gray having a different gray level closer to white, including smoothing to lessen pixelation.
  10. The method of claim 1, further comprising, prior to simulating the one or more dynamic lighting scenarios in the VR user interface: providing a visual stimulus in the VR user interface to measure a user's susceptibility level to motion sickness; and in accordance with a determination that the user's susceptibility to motion sickness is above a predetermined threshold, reducing a refresh rate of the VR user interface.
  11. The method of claim 1, wherein tracking the eye movements and response times comprises tracking eyeball position in relation to light sensitivity while using a light source to cause glare.
  12. The method of claim 1, wherein evaluating the user response based on the eye movements and the response times for testing night vision and glare sensitivity comprises tracking a response time to adapt to changes in lighting conditions as light is decreased.
  13. The method of claim 1, wherein evaluating the user response based on the eye movements and the response times for testing night vision and glare sensitivity comprises tracking a focus on a glare as light is decreased.
  14. The method of claim 1, wherein evaluating the user response based on the eye movements and the response times for testing night vision and glare sensitivity comprises measuring visual acuity under varying light conditions using tests comprising dynamic Snellen charts.
  15. The method of claim 1, wherein evaluating the user response based on the eye movements and the response times for testing night vision and glare sensitivity comprises assessing an ability to distinguish between different shades of gray in low-light scenarios.
  16. The method of claim 1, wherein evaluating the user response based on the eye movements and the response times for testing night vision and glare sensitivity comprises measuring a time taken for a vision to return to baseline or normal vision after exposure to a bright light.
  17. The method of claim 1, wherein evaluating the user response based on the eye movements and the response times for testing night vision and glare sensitivity comprises collecting data on reaction times, accuracy of task completion, eye movement patterns, and recovery times from glare.
  18. An electronic device, comprising: an HMD and a camera; one or more processors; and memory for storing one or more programs for execution by the one or more processors, the one or more programs including instructions for, at an electronic device including a head-mounted display (HMD) and a camera: generating a virtual reality (VR) user interface corresponding to a photorealistic virtual environment; rendering the VR user interface on the HMD; simulating one or more dynamic lighting scenarios in the VR user interface, the one or more dynamic lighting scenarios comprising one or more nighttime scenes in urban streets, country roads, or indoor settings, simulated with varying degrees of ambient light; and while simulating the one or more dynamic lighting scenarios, in real time: continuously tracking, using the camera, eye movements and response times in response to visual stimuli presented in the one or more dynamic lighting scenarios; and evaluating user response based on the eye movements and the response times for testing night vision and glare sensitivity.
  19. A method of implementing a virtual vision test for night vision and glare sensitivity, comprising: at an electronic device including a head-mounted display (HMD) and a camera: generating a virtual reality (VR) user interface corresponding to a photorealistic virtual environment; rendering the VR user interface on the HMD; simulating one or more dynamic lighting scenarios in the VR user interface; while simulating the one or more dynamic lighting scenarios, in real time: continuously tracking, using the camera, eye movements and response times in response to visual stimuli presented in the one or more dynamic lighting scenarios; and evaluating user response based on the eye movements and the response times for testing night vision and glare sensitivity; and prior to simulating the one or more dynamic lighting scenarios in the VR user interface, (i) providing a visual stimulus in the VR user interface to measure a user's susceptibility level to motion sickness and (ii) in accordance with a determination that the user's susceptibility to motion sickness is above a predetermined threshold, reducing a refresh rate of the VR user interface.
  20. A method of implementing a virtual vision test for night vision and glare sensitivity, comprising: at an electronic device including a head-mounted display (HMD) and a camera: generating a virtual reality (VR) user interface corresponding to a photorealistic virtual environment; rendering the VR user interface on the HMD; simulating one or more dynamic lighting scenarios in the VR user interface; and while simulating the one or more dynamic lighting scenarios, in real time: continuously tracking, using the camera, eye movements and response times in response to visual stimuli presented in the one or more dynamic lighting scenarios; and evaluating user response based on the eye movements and the response times for testing night vision and glare sensitivity by collecting data on reaction times, accuracy of task completion, eye movement patterns, and recovery times from glare.
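Claims 10 and 19 recite measuring a user's motion-sickness susceptibility and reducing the refresh rate when it exceeds a predetermined threshold. The claims do not specify how susceptibility is scored or which rates are used, so everything below — the sway-variance proxy, the 0.6 threshold, and the 90 Hz/72 Hz pair — is an illustrative assumption:

```python
from statistics import pvariance

def susceptibility_score(head_sway_cm):
    """Hypothetical proxy: larger postural (head) sway recorded while the
    probe stimulus is shown maps to a 0..1 susceptibility score. The /4.0
    normalization is an arbitrary placeholder."""
    return min(1.0, pvariance(head_sway_cm) / 4.0)

def choose_refresh_rate(score, threshold=0.6, normal_hz=90, reduced_hz=72):
    """Per claims 10/19: drop the HMD refresh rate when the measured
    susceptibility exceeds the predetermined threshold."""
    return reduced_hz if score > threshold else normal_hz
```

90 Hz and 72 Hz are common HMD display modes, which is the only reason they appear here; an actual device would pick from whatever rates its panel supports.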

Description

TECHNICAL FIELD

The present invention relates to vision test technology. More specifically, methods, systems, devices, and non-transitory computer-readable storage media are disclosed for implementing vision testing in an extended reality environment.

BACKGROUND

Traditional visual assessment methods have been the cornerstone of evaluating eye health and vision for many years. These methods are typically conducted in clinical environments, where specialized equipment and standardized procedures are used to ensure accurate and reliable results. The parameters for these assessments are generally fixed, reflecting the controlled nature of the clinical setting. Over time, these techniques have become the accepted standard for diagnosing and monitoring visual conditions, forming the basis of routine eye care practices in medical offices, hospitals, and specialized eye care facilities. Despite their widespread use, these methods have traditionally been limited to professional settings, where they can be conducted under the supervision of trained healthcare providers using dedicated equipment.

SUMMARY

The present disclosure relates to innovative methods and systems that can revolutionize vision care, making vision testing and other exams more accessible and affordable for patients. Additionally, it is contemplated that the principles and features of the present disclosure can be implemented in numerous other applications of display technology, including headsets, heads-up displays, and other micro-displays (e.g., microLED and microOLED), to address challenges and limitations inherent in such products and their uses. In accordance with at least some embodiments disclosed herein is the realization that traditional methods for visual assessment do not allow for dynamic adjustment of test parameters, leading to less accurate assessments, nor can they be implemented to test eyes and vision at home using household devices in a consistent and environment-locked manner.
Some embodiments are directed to a method of implementing a virtual vision test at an electronic device including a head-mounted display (HMD) and a camera. The method includes executing a user application configured to enable the virtual vision test; generating a virtual reality (VR) user interface corresponding to a three-dimensional (3D) virtual environment; focusing the camera on an eye area of a user wearing the electronic device; displaying, on the user interface, a visual stimulus corresponding to the virtual vision test; while displaying the visual stimulus, in real time, capturing a sequence of eye images using the camera of the electronic device; determining eye movement information including a temporal sequence of eyeball positions based on the sequence of eye images; and comparing the visual stimulus and the eye movement information to determine an eye health condition.

In some embodiments, a user application can be implemented by a head-mounted display (HMD) device configured to create a customized extended reality (XR) environment for a user engaged on an XR information platform. Products may be rendered for the user in a three-dimensional format in the XR environment, thereby facilitating eyewear selection and fitting. XR is an umbrella term encapsulating Augmented Reality (AR), Virtual Reality (VR), Mixed Reality (MR), and everything in between. In this application, any embodiments that apply a VR system can be implemented using an AR or MR system as well.

Some embodiments are directed to a method of implementing a virtual vision test for evaluating night vision and glare sensitivity. The method is performed at an electronic device including an HMD and a camera. The method includes generating a virtual reality (VR) user interface corresponding to a photorealistic virtual environment. The method also includes rendering the VR user interface on the HMD.
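A related evaluation recited in claim 16 is the time for vision to return to baseline after exposure to a bright light. One observable proxy the eye-facing camera could supply is pupil diameter; the sketch below times how long the pupil (constricted by a glare flash) takes to re-dilate to near its pre-flash baseline. The data shape, the 0.3 mm tolerance, and the function name are assumptions for illustration only:

```python
def glare_recovery_ms(flash_off_t_ms, baseline_mm, pupil_series, tol_mm=0.3):
    """Time (ms) after the flash ends for pupil diameter to return to
    within `tol_mm` of its pre-flash baseline, or None if it never does.
    `pupil_series` is a list of (t_ms, diameter_mm) camera measurements
    in chronological order."""
    for t_ms, d_mm in pupil_series:
        if t_ms >= flash_off_t_ms and abs(d_mm - baseline_mm) <= tol_mm:
            return t_ms - flash_off_t_ms
    return None
```

Measurements taken while the flash is still on are naturally skipped by the timestamp check, so the same series can span the whole trial.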
The method also includes simulating one or more dynamic lighting scenarios in the VR user interface. The method also includes, while simulating the one or more dynamic lighting scenarios, in real time: continuously tracking, using the camera, eye movements and response times in response to visual stimuli presented in the one or more dynamic lighting scenarios; and evaluating user response based on the eye movements and the response times for testing night vision and glare sensitivity.

Some embodiments are directed to a method of implementing a virtual vision test for measuring pupil reaction to light changes and visual imperfections in virtual environments. The method is performed at an electronic device including a head-mounted display and a camera. The method includes generating a virtual reality (VR) user interface corresponding to a photorealistic virtual environment. The method also includes rendering the VR user interface on the HMD. The method also includes simulating one or more dynamic lighting scenarios in the VR user interface. The method also includes, while simulating the one or more