
EP-4405736-B1 - COMBINED BIREFRINGENT MATERIAL AND REFLECTIVE WAVEGUIDE FOR MULTIPLE FOCAL PLANES IN A MIXED-REALITY HEAD-MOUNTED DISPLAY DEVICE AND RELATIVE METHOD

EP4405736B1

Inventors

  • HELD, Robert Thomas
  • KRESS, Bernard Charles
  • SAULSBURY, Ashley
  • RESHIDKO, Dmitry

Dates

Publication Date
2026-05-06
Application Date
2022-07-28

Claims (15)

  1. A method (1700) for operating an electronic device (100) that includes a mixed-reality see-through optical display system configured for showing mixed-reality scenes comprising virtual images of virtual-world objects (225) that are rendered over views of real-world objects (605) to a user (115) of the electronic device (100), the method comprising: receiving light for the virtual images, the light being linearly polarized in a first polarization state (1705); operating a ferroelectric liquid crystal, FLC, modulator to switch between the first polarization state for the virtual image light and a second polarization state that is orthogonal to the first polarization state (1710); providing a lens of birefringent material upon which virtual image light is incident in either the first polarization state or second polarization state, in which the lens provides one of two different focal distances for the virtual images depending on polarization state of the incident virtual image light (1715); in-coupling the virtual image light from the lens into the mixed-reality see-through optical display system which renders the virtual images at the one of two different focal distances to the user (1720); characterised by stacking combinations of FLC modulators and lenses of birefringent material that act on the received virtual image light in series, in which each combination in the stack provides two unique focal distances for the rendered virtual images.
  2. The method of claim 1 further comprising operating the FLC modulator at a rate that is synchronized to a refresh rate of the received virtual image light to provide a temporally multiplexed virtual image display comprising one or more virtual images located at either one or the other different focal distances or located at both of the different focal distances simultaneously.
  3. The method of claim 1 further comprising operating the FLC modulator according to a composition of a mixed-reality scene, in which the composed mixed-reality scene includes virtual-world objects that are located at different focal distances.
  4. The method of any preceding claim wherein the virtual image is generated by a virtual image source which is located on the electronic device outside the user's line of sight.
  5. A head-mounted display, HMD, device (100) wearable by a user (115) and configured for supporting a mixed-reality experience including viewing, by the user (115), of virtual images (725) that are combined with views of real-world objects (605) in a physical world, comprising: a focal-distance modulation system that is operable to receive virtual images (725) from a virtual image source (725), the focal-distance modulation system comprising a polarization modulator (705) and a birefringent lens (710), wherein the polarization modulator (705) is configured to selectively switch polarization of the virtual images (725) between two orthogonal states, and wherein the birefringent lens (710) has two different refractive indices each with sensitivity to a different orthogonal state of polarization of virtual images (725), wherein virtual images (725) in a first polarization state are focused by the birefringent lens (710) at a first focal distance (d1), and wherein virtual images (725) in a second polarization state are focused by the birefringent lens (710) at a second focal distance (d2), wherein the focal-distance modulation system further comprises at least an additional polarization modulator and an additional birefringent lens, wherein a total of N polarization modulator/birefringent lens pairs are utilized to provide 2^N different focal distances; and an optical combiner (750) with which the user (115) can see the real-world objects (605) and the virtual images (740) in a mixed-reality scene, the optical combiner (750) including an input coupler (525) configured to in-couple virtual images (725) from the focal-distance modulation system that are focused at either the first or second focal distance (d1, d2) into the optical combiner (750), and further including an output coupler (530) configured to out-couple the virtual images (740) that are focused at either the first or second focal distance (d1, d2) from the optical combiner (750) to one or more of the user's (115) eyes.
  6. The HMD device of claim 5 further comprising a linear polarizing filter that is arranged to linearly polarize light from the virtual image source.
  7. The HMD device of claim 5 further comprising an eye tracker for tracking vergence of the user's eyes or tracking a gaze direction of at least one eye of the user to perform one of: calibration of alignment between the user's eye and the optical combiner, dynamic determination of whether alignment changes during use of the HMD device, or composition of a mixed-reality scene at the virtual image source.
  8. The HMD device of claim 7, wherein the eye tracker performs the composition of the mixed-reality scene, comprising the eye tracker rendering virtual images in a single focal plane that is selected based on operation of the eye tracker to determine a gaze point of the user.
  9. The HMD device of claim 8 further comprising a focal plane controller operatively coupled to the polarization modulator and configured to selectively switch the polarization state of the virtual images at a rate that is synchronized with a refresh rate of the virtual image source to generate virtual images at different focal distances in the mixed-reality scene supported by the optical combiner.
  10. The HMD device of claim 9 in which the focal plane controller is further operatively coupled to the virtual image source and configured to selectively switch the polarization state of the virtual images based on a composition of a mixed-reality scene generated at the virtual image source.
  11. The HMD device of claim 5 in which the optical combiner comprises a waveguide that is at least partially transparent, the waveguide configured for guiding focused virtual images from the input coupler to the output coupler.
  12. The HMD device of claim 11 in which one or more of the input coupler, output coupler, or waveguide include one or more reflective surfaces.
  13. The HMD device of claim 5 in which the optical combiner is configured to provide an exit pupil that is expanded in one or more directions relative to an input pupil to the optical combiner.
  14. The HMD device of claim 5 in which the polarization modulator comprises one of ferroelectric liquid crystal, FLC, modulator, photo-elastic modulator, electro-optic modulator, magneto-optic modulator, or piezoelectric modulator.
  15. The HMD device of any of claims 5-14 wherein the virtual image source is located on the HMD device outside the user's line of sight.
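The stacking arrangement recited in claims 1 and 5 can be illustrated numerically: each FLC modulator selects which of the following lens's two refractive indices (and hence which of two optical powers) the virtual image light sees, and the powers of the stacked lenses add, so N modulator/lens pairs yield up to 2^N distinct focal distances. A minimal sketch, with assumed lens powers in diopters (the function name and the numeric values are illustrative, not taken from the patent):

```python
# Sketch (not from the patent): enumerate the focal distances produced
# by a stack of N polarization-modulator / birefringent-lens pairs.
from itertools import product

def focal_distances(lens_powers):
    """lens_powers: list of (P_ordinary, P_extraordinary) in diopters.
    Each modulator selects which index the following lens presents,
    so N pairs yield up to 2**N distinct total focal distances."""
    distances = set()
    for bits in product((0, 1), repeat=len(lens_powers)):
        # Thin lenses in series: optical powers add.
        total_power = sum(pair[b] for pair, b in zip(lens_powers, bits))
        if total_power > 0:
            distances.add(round(1.0 / total_power, 3))  # focal distance in meters
    return sorted(distances)

# Two pairs -> up to 2**2 = 4 focal planes (example powers in diopters)
print(focal_distances([(0.5, 1.0), (0.25, 2.0)]))  # → [0.333, 0.4, 0.8, 1.333]
```

This also shows why the claims distinguish the single-pair case (two focal distances, as in claim 1's base method) from the stacked case: every added pair doubles the number of addressable focal planes.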

Description

BACKGROUND

Mixed-reality computing devices, such as head-mounted display (HMD) devices, may be configured to display information to a user about virtual objects, such as holographic images, and/or real objects in a field of view of the user. For example, an HMD device may be configured to display, using a see-through display system, virtual environments with real-world objects mixed in, or real-world environments with virtual objects mixed in.

To view objects clearly, humans must accommodate, or adjust their eyes' focus, to the distance of the object. At the same time, the rotation of both eyes must converge to the object's distance to avoid seeing double images. In natural viewing, vergence and accommodation are linked. When something near is viewed, for example a housefly close to the nose, the eyes cross and accommodate to a near point. Conversely, when something is viewed at optical infinity (roughly starting at 6 m or farther for normal vision), the eyes' lines of sight become parallel and the eyes' lenses accommodate to infinity. In most HMD devices, users will always accommodate to the focal distance of the display to get a sharp image but converge to the distance of the object of interest to get a single image. When users accommodate and converge to different distances, the natural link between the two cues is broken, leading to visual discomfort or fatigue.

Tao Zhan et al., "Multifocal displays: review and prospect", PhotoniX, 30 March 2020, DOI: 10.1186/s43074-020-00010, describes that conventional stereoscopic three-dimensional displays suffer from vergence-accommodation conflict because the stimulus to accommodation is fixed by the display panel and viewing optics, but the stimulus to vergence changes with image content. A review of multifocal display design and development is provided.
A comprehensive classification of numerous potential optical architectures to provide the multiplanar functionality is provided, based on how the information is multiplexed and how the focal planes are generated. The strengths and obstacles of reported or potential designs in each category are analyzed and compared with each other. In addition to the enabling optics, the image rendering approaches for the multifocal planes are also described.

US 10948712 B2 describes a head-mounted light field display device, the device comprising at least one multiplexed light field display module adapted to face an eye of a viewer wearing the device, the multiplexed light field display module comprising a light field view image generator and a waveguide with a set of shutters, the light field view image generator operable to generate, over time, a set of beams of light from a different one of a set of light field view images, the shuttered waveguide operable to transmit the set of beams and to open, over time, a different subset of the set of shutters, the subset corresponding to a position associated with the view image, thereby to emit the set of beams via the subset, and thereby to display to the viewer a time-varying optical light field representative of the set of view images.

WO 2018106253 A1 describes an optical display system that includes an information display (image-generating) component, a polarization rotator, a polarization-dependent optical element, an input holographic coupler, a light guide and an output holographic coupler. By controlling the polarization of the displayed light through the polarization rotator, the polarization-dependent optical element changes the viewable content to different distances from the viewer. This enables the generation of a proper light field, which is then coupled into the light guide through the input holographic coupler and finally passes through the output holographic coupler to the user's eye.
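The vergence-accommodation conflict that motivates these multifocal designs can be quantified as the difference, in diopters, between the display's fixed focal distance (which drives accommodation) and the rendered object's distance (which drives vergence). A minimal sketch with assumed values (the 63 mm interpupillary distance and the example distances are illustrative, not from the source):

```python
import math

def vac_mismatch_diopters(display_focal_m, object_m):
    """Accommodation is driven by the display's fixed focal distance;
    vergence by the rendered object's distance. The mismatch in
    diopters (1/m) is a common proxy for visual discomfort."""
    return abs(1.0 / display_focal_m - 1.0 / object_m)

def vergence_angle_deg(object_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight for an object
    straight ahead (ipd = interpupillary distance, assumed 63 mm)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / object_m))

# Display fixed at 2 m, virtual object rendered at 0.5 m:
print(vac_mismatch_diopters(2.0, 0.5))   # → 1.5 (diopters of conflict)
print(round(vergence_angle_deg(0.5), 2))  # → 7.21 (degrees)
```

A mismatch approaching or exceeding about half a diopter is typically where discomfort is reported, which is why the designs above switch the focal plane rather than leave it fixed.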
CN 112051675 A describes a near-to-eye display device comprising: a display screen for displaying different images in a time-multiplexed manner; a polarization converter located on the light-emitting side of the display screen, used for converting the emergent light of the different displayed images into first circularly polarized light and second circularly polarized light in a time-multiplexed manner, the rotation directions of the first and second circularly polarized light being opposite; a polarizing lens positioned on the side of the polarization converter facing away from the display screen; and a focusing lens likewise positioned on the side of the polarization converter facing away from the display screen, the polarizing lens and the focusing lens being used for focusing the first and second circularly polarized light at different focal lengths, so that different images are focused at different focal lengths. Using the polarizing lens as an imaging element reduces the overall thickness of the near-to-eye display device and facilitates integration and miniaturization. Different images are focused on different foc