JP-7857387-B2 - Eyepiece for augmented reality display systems
Inventors
- Samarth Bhargava
- Victor Kai Liu
- Kevin Messer
Assignees
- Magic Leap, Inc.
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2024-12-19
- Priority Date
- 2017-12-15
Claims (20)
- An eyepiece waveguide for an augmented reality display system, comprising: an optically transparent substrate having a first surface and a second surface; a first input coupling grating (ICG) region formed on or within one surface of the optically transparent substrate, wherein the first ICG region is configured to receive an input beam of light and couple the input beam into the optically transparent substrate as a guided beam; a multidirectional pupil expander (MPE) region formed on or within the first surface of the optically transparent substrate, wherein the MPE region comprises a plurality of diffraction features exhibiting periodicity along at least a first periodic axis and a second periodic axis, and the MPE region is positioned to receive the guided beam from the first ICG region, diffract it in a plurality of directions, and create a plurality of diffracted beams; and an exit pupil expander (EPE) region formed on or within the second surface of the optically transparent substrate, wherein the EPE region is configured to couple a first subset of the plurality of diffracted beams out of the optically transparent substrate as output beams propagating along parallel paths, thereby causing a first portion of an image to appear as if it originates from optical infinity, and to couple other diffracted beams out of the optically transparent substrate as output beams propagating along divergent paths.
- The eyepiece waveguide according to claim 1, wherein the MPE region and the EPE region partially overlap.
- The eyepiece waveguide according to claim 1, wherein the MPE region and the EPE region are substantially equal in size.
- The eyepiece waveguide according to claim 3, wherein the MPE region and the EPE region are mutually aligned.
- The eyepiece waveguide according to claim 1, wherein the first ICG region comprises a diffraction grating having a plurality of periodically repeating lines, and the EPE region comprises a diffraction grating having a plurality of periodically repeating lines oriented perpendicularly to the plurality of periodically repeating lines of the diffraction grating in the first ICG region.
- The eyepiece waveguide according to claim 1, wherein the MPE region comprises a two-dimensional lattice pattern of distinct diffraction features.
- The eyepiece waveguide according to claim 1, wherein the MPE region comprises a crossed grating.
- The eyepiece waveguide according to claim 1, wherein the MPE region is configured to create the plurality of diffracted beams by diffracting a portion of the power of the guided beam from the first ICG region in at least four directions.
- The eyepiece waveguide according to claim 8, wherein one of the four directions corresponds to a zero-order diffracted beam.
- The eyepiece waveguide according to claim 8, wherein three or more of the four directions correspond to first-order diffracted beams.
- The eyepiece waveguide according to claim 8, wherein the four directions are angularly separated by 90 degrees.
- The eyepiece waveguide according to claim 1, wherein the MPE region is further configured to increase the number of diffracted beams by diffracting, in the plurality of directions at a plurality of distributed locations, diffracted beams that were initially diffracted and are still propagating within the MPE region.
- The eyepiece waveguide according to claim 1, wherein the first periodic axis and the second periodic axis in the diffraction features of the MPE region are not orthogonal.
- The eyepiece waveguide according to claim 1, wherein the diffraction efficiency of the diffraction features in the MPE region is spatially variable.
- The eyepiece waveguide according to claim 14, wherein the diffraction features located within the MPE region, closer to the first ICG region, have a higher diffraction efficiency.
- The eyepiece waveguide according to claim 14, wherein diffraction features located within the MPE region, closer to the axis along which the first ICG region directs the guided beam, have a higher diffraction efficiency.
- The eyepiece waveguide according to claim 1, further comprising one or more additional ICG regions provided at one or more corresponding locations around the MPE region, the one or more additional ICG regions providing one or more corresponding additional input beams of light incident on the MPE region at different locations.
- The eyepiece waveguide according to claim 1, wherein the diffraction efficiency of the diffraction features within the EPE region is spatially variable.
- The eyepiece waveguide according to claim 18, wherein the diffraction features located closer to the periphery of the EPE region have a higher diffraction efficiency.
- The eyepiece waveguide according to claim 1, further comprising one or more diffractive mirrors located around the periphery of the optically transparent substrate.
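The multidirectional diffraction recited in claims 8 through 11 can be illustrated with simple in-plane grating-vector arithmetic: each interaction with the MPE grating adds integer multiples of its two grating vectors to the guided beam's in-plane k-vector. The sketch below is not taken from the patent; the wavelength, pitch, and choice of diffraction orders are hypothetical values, chosen only so that the resulting beams emerge in four directions separated by 90 degrees.

```python
import math

# Illustrative sketch of multidirectional diffraction in an MPE region.
# The pitch and wavelength below are hypothetical, not from the patent.
wavelength = 0.52e-6    # green light, meters (hypothetical)
pitch = 0.38e-6         # grating pitch along each periodic axis (hypothetical)

G = 2 * math.pi / pitch  # magnitude of each grating vector

# Two orthogonal periodic axes (cf. claim 8: diffraction in four directions).
G1 = (G, 0.0)            # grating vector along x
G2 = (0.0, G)            # grating vector along y

# In-plane k-vector of a guided beam traveling along +x, chosen (for
# illustration) to equal the grating-vector magnitude.
k_in = (G, 0.0)

def diffract(k, m1, m2):
    """Return the in-plane k-vector after picking up orders (m1, m2)."""
    return (k[0] + m1 * G1[0] + m2 * G2[0],
            k[1] + m1 * G1[1] + m2 * G2[1])

# Zero order plus higher orders: four directions, 90 degrees apart.
for m in [(0, 0), (-1, 1), (-1, -1), (-2, 0)]:
    kx, ky = diffract(k_in, *m)
    print(f"orders {m}: direction {math.degrees(math.atan2(ky, kx)):.0f} deg")
```

The `(0, 0)` order leaves the beam's direction unchanged, matching claim 9's zero-order beam, while the other orders rotate the in-plane k-vector by multiples of 90 degrees, matching claim 11.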
Description
(Incorporation by Reference of All Priority Applications)

This application claims priority to U.S. Provisional Patent Application No. 62/599663, filed on 15 December 2017 and titled "EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM"; U.S. Provisional Patent Application No. 62/608555, filed on 20 December 2017 and titled "EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM"; and U.S. Provisional Patent Application No. 62/620465, filed on 22 January 2018 and titled "EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM". Any application for which a foreign or domestic priority claim is identified in the application data sheet as filed with the present application is incorporated herein by reference under 37 CFR 1.57.

(Field)

This disclosure relates to eyepieces for virtual reality, augmented reality, and mixed reality systems.

(Description of the Related Art)

Modern computing and display technologies are driving the development of virtual reality, augmented reality, and mixed reality systems. Virtual reality, or "VR," systems create simulated environments for users to experience. This can be done by presenting computer-generated image data to the user through a head-mounted display. The image data creates a sensory experience that immerses the user in the simulated environment. Virtual reality scenarios typically present only computer-generated image data, without also including real-world image data.

Augmented reality systems generally supplement the real-world environment with simulated elements. For example, an augmented reality (AR) system can provide a user with a view of the surrounding real-world environment via a head-mounted display. Computer-generated image data can also be presented on the display to enhance the real-world environment. This computer-generated image data can include elements that are contextually relevant to the real-world environment, such as simulated text, images, and objects.
Mixed reality (MR) systems are a type of AR system that also introduce simulated objects into the real-world environment, but these objects are typically characterized by a greater degree of interactivity; the simulated elements can often be interactive in real time. Figure 1 depicts an exemplary AR/MR scene 1, where the user sees a real-world park setting 6 featuring people, trees, and buildings in the background, and a concrete platform 20. In addition to these items, computer-generated image data is also presented to the user. This computer-generated image data could include, for example, a robot figure 10 standing on the real-world platform 20 and a flying cartoon-like avatar character 2 that appears to be a personification of a bumblebee, although these elements 2 and 10 do not actually exist in the real-world environment.

(Brief Description of the Drawings)

Figure 1 illustrates the user's view of an augmented reality (AR) scene through an AR device.

Figure 2 illustrates an embodiment of a wearable display system.

Figure 3 illustrates a conventional display system for simulating three-dimensional images for the user.

Figure 4 illustrates aspects of an approach to simulating a three-dimensional image using multiple depth planes.

Figures 5A-5C illustrate the relationship between the radius of curvature and the focal radius.

Figure 6 illustrates an example of a waveguide stack for outputting image information to the user within an AR eyepiece.

Figures 7A-7B illustrate an example of an output beam generated by a waveguide.

Figure 8 illustrates an embodiment of a stacked waveguide assembly, where each depth plane includes an image formed using multiple different primary colors.

Figure 9A shows a cross-sectional side view of an embodiment of a set of stacked waveguides, each including an in-coupling optical element.

Figure 9B shows a perspective view of an embodiment of the stacked waveguides shown in Figure 9A.
Figure 9C shows top and bottom plan views of the embodiment of the multiple stacked waveguides shown in Figures 9A and 9B.

Figure 10 is a perspective view of an exemplary AR eyepiece waveguide stack.

Figure 11 is a cross-sectional view of a portion of an exemplary eyepiece waveguide stack, with an edge sealing structure for supporting the eyepiece waveguides in a stacked configuration.

Figures 12A and 12B illustrate top views of an eyepiece waveguide in operation as it projects an image toward the user's eye.

Figure 13A illustrates a k-vector, which can be used to represent the propagation direction of a ray or beam of light.

Figure 13B illustrates light rays within a planar waveguide.

Figure 13C illustrates the allowed k-vectors for light of a given angular frequency ω propagating in an unbounded homogeneous medium with refractive index n.

Figure 13D illustrates the allowable k-vector for light of a given angular frequency ω propagating thr
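The allowed-k-vector condition referenced in the Figure 13 series amounts to |k| = nω/c in a homogeneous medium of refractive index n, and a beam remains guided in the waveguide by total internal reflection when its in-plane k-component exceeds the largest k-magnitude allowed in the surrounding air. A minimal numerical sketch of that condition follows; the wavelength and substrate index are hypothetical examples, not values from the patent.

```python
import math

# Sketch of the k-vector magnitude constraint: |k| = n * w / c for light
# of angular frequency w in a medium of index n. Values are hypothetical.
c = 299_792_458.0                      # speed of light in vacuum, m/s
wavelength_vac = 0.52e-6               # vacuum wavelength, m (hypothetical)
w = 2 * math.pi * c / wavelength_vac   # angular frequency, rad/s

def k_magnitude(n):
    """Magnitude of an allowed k-vector in a medium of refractive index n."""
    return n * w / c

n_air = 1.0
n_glass = 1.8                          # hypothetical substrate index

def is_guided(k_inplane):
    """True if the in-plane k-component exceeds the largest |k| allowed in
    air, i.e. the beam is trapped by total internal reflection."""
    return k_inplane > k_magnitude(n_air)

k_max_glass = k_magnitude(n_glass)
print(is_guided(0.9 * k_max_glass))    # steep in-plane component: guided
print(is_guided(0.3 * k_max_glass))    # shallow component: couples out to air
```

This is the mechanism the ICG, MPE, and EPE regions exploit: the ICG pushes the in-plane k-component into the guided range, and the EPE pulls it back out so the beam can exit toward the eye.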