JP-7857470-B2 - Eyepiece for augmented reality display systems
Inventors
- Samarth Bhargava
- Victor Kai Liu
- Kevin Messer
Assignees
- Magic Leap, Inc.
Dates
- Publication Date
- 2026-05-12
- Application Date
- 2025-04-10
- Priority Date
- 2018-11-20
Claims (14)
- An eyepiece waveguide for an augmented reality display system, comprising: an optically transparent substrate having a first surface and a second surface; an input coupling grating (ICG) region formed on or in one of the surfaces of the substrate, wherein the ICG region is configured to receive an input beam of light and to couple the input beam into the substrate as a guided beam; a first combined pupil expander-extractor (CPE) grating region formed on or in the first surface of the substrate, wherein the first CPE grating region is positioned to receive the guided beam from the ICG region, to create a first plurality of diffracted beams at a plurality of distributed locations, and to externally couple a first plurality of output beams, the first CPE grating region being a first two-dimensional (2D) CPE grating region; and a second CPE grating region formed on or in the second surface of the substrate, wherein the second CPE grating region is positioned to receive the guided beam from the ICG region, to create a second plurality of diffracted beams at a plurality of distributed locations, and to externally couple a second plurality of output beams, the second CPE grating region being a second 2D CPE grating region.
- The eyepiece waveguide according to claim 1, wherein the first CPE grating region is configured to externally couple the second plurality of diffracted beams, and the second CPE grating region is configured to externally couple the first plurality of diffracted beams.
- The eyepiece waveguide according to claim 2, wherein the first and second plurality of diffracted beams alternately interact with the first and second CPE grating regions.
- The eyepiece waveguide according to claim 1, wherein the first and second CPE grating regions overlap by at least 90%.
- The eyepiece waveguide according to claim 1, wherein the first and second CPE grating regions are of the same size.
- The eyepiece waveguide according to claim 5, wherein the first and second CPE grating regions are aligned with each other.
- The eyepiece waveguide according to claim 1, wherein the first CPE grating region is configured to create the first plurality of diffracted beams by diffracting a portion of the power of the guided beam from the ICG region into at least two directions.
- The eyepiece waveguide according to claim 7, wherein one of the two directions corresponds to a zero-order diffracted beam.
- The eyepiece waveguide according to claim 1, wherein the second CPE grating region is configured to create the second plurality of diffracted beams by diffracting a portion of the power of the guided beam from the ICG region into at least two directions.
- The eyepiece waveguide according to claim 9, wherein one of the two directions corresponds to a zero-order diffracted beam.
- The eyepiece waveguide according to claim 1, wherein the first plurality of diffracted beams propagate in a first direction, and the second plurality of diffracted beams propagate in a second direction at a substantially 60° angle with respect to the first direction.
- The eyepiece waveguide according to claim 1, wherein the input beam is collimated and has a diameter of 5 mm or less.
- The eyepiece waveguide according to claim 1, wherein the optically transparent substrate is planar.
- The eyepiece waveguide according to claim 1, wherein the eyepiece waveguide is incorporated into an eyepiece for an augmented reality display system.
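The k-space geometry underlying the claims above can be sketched numerically. In a waveguide of refractive index n, light is guided by total internal reflection when its in-plane k-vector magnitude (in units of k0 = 2π/wavelength) lies in the annulus 1 < |k| ≤ n. The index value and grating-vector magnitude below are illustrative assumptions, not values taken from the patent; the sketch shows how an ICG diffraction can place a beam in the guided annulus, how a CPE grating vector can redirect it by 60° (claim 11), and how a second diffraction can return it below |k| = 1 so that it is externally coupled toward the viewer.

```python
import numpy as np

# In-plane k-vectors in units of k0 = 2*pi/wavelength.
n = 1.8  # assumed substrate refractive index, for illustration only

def is_guided(k):
    """Guided by TIR if the in-plane k-vector lies in the annulus 1 < |k| < n."""
    return 1.0 < np.linalg.norm(k) < n

def rot(deg):
    """2D rotation matrix for the given angle in degrees."""
    t = np.radians(deg)
    return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

k_in = np.array([0.0, 0.0])    # collimated input beam at normal incidence
G_icg = np.array([1.4, 0.0])   # ICG grating vector (assumed magnitude)
k_guided = k_in + G_icg        # ICG couples the beam into the guided annulus
assert is_guided(k_guided)

# A CPE grating vector of the same magnitude, oriented 120 degrees from the
# guided beam, rotates the in-plane k-vector by 60 degrees while keeping it
# in the annulus -- the "substantially 60 degree" relation of claim 11.
G_cpe1 = rot(120) @ G_icg
k_diff = k_guided + G_cpe1
assert is_guided(k_diff)

# A second diffraction by the complementary grating vector brings the
# k-vector back to the origin (|k| <= 1), so the beam leaves the waveguide:
# the "extractor" half of the combined pupil expander-extractor.
G_cpe2 = rot(240) @ G_icg
k_out = k_guided + G_cpe1 + G_cpe2
print(np.linalg.norm(k_diff))  # 1.4 -> still guided, rotated by 60 degrees
print(np.linalg.norm(k_out))   # 0.0 -> externally coupled toward the eye
```

Because the three grating vectors here sum to zero (a hexagonal arrangement), every beam that undergoes one expansion step plus one extraction step exits parallel to the input beam, which is what lets the two overlapping 2D CPE regions of claims 1-4 both replicate the pupil and extract light.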
Description
(Incorporation by reference to any priority claim) This application claims priority to U.S. Provisional Patent Application 62/769,933, filed on November 20, 2018, and titled "EYEPIECES FOR AUGMENTED REALITY DISPLAY SYSTEM." The aforementioned application and any other applications, in which foreign or domestic priority claims are identified in the application data sheet filed together with this application, are incorporated herein by reference under 37 CFR 1.57.

This disclosure relates to eyepieces for virtual reality, augmented reality, and mixed reality systems.

Modern computing and display technologies are driving the development of virtual reality, augmented reality, and mixed reality systems. Virtual reality, or "VR," systems create simulated environments for users to experience. This can be achieved by presenting computer-generated image data to the user through a head-mounted display. This image data creates a sensory experience that immerses the user in the simulated environment. Virtual reality scenarios typically involve the presentation of computer-generated image data only, without also including actual real-world image data.

Augmented reality systems generally complement the real-world environment with simulated elements. For example, augmented reality, or "AR," systems can provide users with a view of their surrounding real-world environment via a head-mounted display. However, computer-generated image data can also be presented on the display to enhance the real-world environment. This computer-generated image data can include elements that are contextually relevant to the real-world environment, such as simulated text, images, and objects. Mixed reality, or "MR," is a type of AR system that also introduces simulated objects into the real-world environment, but these objects are typically characterized by a greater degree of interactivity: the simulated elements can often respond in real time.
Figure 1 depicts an exemplary AR scene 1, where the user sees a real-world park setting 6 featuring people, trees, buildings in the background, and a concrete platform 20. In addition to these items, computer-generated image data is also presented to the user. This computer-generated image data could include, for example, a robot figure 10 standing on the real-world platform 20 and a flying, cartoonish avatar character 2 that appears to be a personification of a bumblebee, although these elements 2 and 10 do not actually exist in the real-world environment.

Figure 1 illustrates the user's view of an augmented reality (AR) scene through an AR device. Figure 2 illustrates an embodiment of a wearable display system. Figure 3 illustrates a conventional display system for simulating three-dimensional image data for the user. Figure 4 illustrates aspects of an approach to simulating three-dimensional image data using multiple depth planes. Figures 5A-5C illustrate the relationship between the radius of curvature and the focal radius. Figure 6 illustrates an example of a waveguide stack for outputting image information to the user within an AR eyepiece. Figures 7A-7B illustrate an example of an output beam generated by a waveguide. Figure 8 illustrates an embodiment of a stacked waveguide assembly, where each depth plane includes an image formed using multiple different primary colors. Figure 9A shows cross-sectional side views of embodiments of stacked waveguide sets, each including an in-coupling optical element. Figure 9B shows a perspective view of an embodiment of the multiple stacked waveguides shown in Figure 9A. Figure 9C shows top and bottom plan views of the embodiment of the multiple stacked waveguides shown in Figures 9A and 9B. Figure 10 is a perspective view of an exemplary AR eyepiece waveguide stack.
Figure 11 is a cross-sectional view of a portion of an exemplary eyepiece waveguide stack, with an edge sealing structure for supporting the eyepiece waveguides in a stacked configuration. Figures 12A and 12B illustrate the top view of the eyepiece waveguide during operation when projecting an image toward the user's eye. Figure 13A illustrates a k-vector, which can be used to represent the propagation direction of a ray or beam of light. Figure 13B illustrates the light rays within a planar waveguide. Figure 13C illustrates the allowable k-vectors for light of a given angular frequency ω propagating in an unbounded homogeneous medium with refractive index n. Figure 13D illustrates the allowable k-vectors for light of a given angular frequency ω propagating through a homogeneous planar waveguide medium with refractive index n. Figure 13E illustrates an annulus in k-space corresponding to the k-vectors of light waves that can be guided in a waveguide having refractive index n₂. Figure 13F shows a k-space schematic and an eyepiece waveguide illus