EP-4235391-B1 - VIRTUAL, AUGMENTED, AND MIXED REALITY SYSTEMS AND METHODS

EP 4235391 B1

Inventors

  • RODRIGUEZ, JOSE FELIX
  • PEREZ, RICARDO MARTINEZ

Dates

Publication Date
2026-05-06
Application Date
2017-08-22

Claims (13)

  1. A method (4000) in a virtual, augmented, or mixed reality system (80), the method comprising: the system operating in a first power mode (4002) corresponding to a first processor mode; the system receiving a request for a second processor mode (4004); the system switching to a second power mode (4006) corresponding to the second processor mode from the first power mode in response to receiving the request for the second processor mode; the system receiving an indicator of acceptability of the first processor mode (4008); and the system switching to the first power mode from the second power mode (4010) in response to receiving the indicator of acceptability of the first processor mode, wherein the first power mode is a low power mode, wherein the first processor mode is a low processor mode, such that the system operates in the low power mode corresponding to the low processor mode, wherein the second power mode is a normal power mode, and wherein the second processor mode is a normal processor mode, such that: the system receives a request for the normal processor mode (4004); the system switches to the normal power mode (4006) corresponding to the normal processor mode from the low power mode in response to receiving the request for the normal processor mode; the system receives the indicator of acceptability of the low processor mode (4008); and the system switches to the low power mode from the normal power mode (4010) in response to receiving the indicator of acceptability of the low processor mode, wherein the system receiving the request for the normal processor mode (4004) comprises receiving the request for the normal processor mode through a low latency communication channel.
  2. The method (4000) of Claim 1, wherein the low power mode comprises a system component being switched off or being in a standby mode with a fast wake-up function, and wherein the system switching to the normal power mode from the low power mode comprises the system activating the system component that was previously switched off or in a standby mode.
  3. The method (4000) of Claims 1 or 2, wherein the request for the normal processor mode is generated in response to a user's pose changing more than a predetermined threshold amount.
  4. The method (4000) of any of Claims 1-3, wherein the indicator of acceptability of the low processor mode is a user's pose changing less than a predetermined threshold amount in a predetermined time.
  5. A method (4100) in a virtual, augmented, or mixed reality system, the method comprising the system operating in a first power mode (4102) corresponding to a first processor mode; the system receiving a request for a second processor mode (4104); the system switching to a second power mode (4106) corresponding to the second processor mode from the first power mode in response to receiving the request for the second processor mode; the system receiving an indicator of acceptability of the first processor mode (4108); and the system switching to the first power mode from the second power mode (4110) in response to receiving the indicator of acceptability of the first processor mode, wherein the first power mode is a normal power mode, wherein the first processor mode is a normal processor mode, such that the system operates in the normal power mode corresponding to the normal processor mode, wherein the second power mode is a high power mode, and wherein the second processor mode is a high processor mode, such that: the system receives a request for a high processor mode (4104); the system switches to a high power mode (4106) corresponding to the high processor mode from the normal power mode in response to receiving the request for the high processor mode; the system receives the indicator of acceptability of the normal processor mode (4108); and the system switches to the normal power mode from the high power mode (4110) in response to receiving the indicator of acceptability of the normal processor mode, wherein the request for the high processor mode is generated in response to a request to render more than a predetermined threshold amount of virtual objects.
  6. The method (4100) of Claim 5, wherein the high power mode comprises an increased amount of current available to the system, and wherein the system switching to the normal power mode from the high power mode (4110) comprises the system reducing the amount of current available to the system.
  7. The method (4100) of Claims 5 or 6, wherein the indicator of acceptability of the normal processor mode is a request to render less than a predetermined threshold amount of virtual objects for a predetermined time.
  8. A method (4200) in a virtual, augmented, or mixed reality system, the method comprising the system operating in a first power mode (4202) corresponding to a first processor mode; the system receiving a request for a second processor mode (4204); the system switching to a second power mode (4206) corresponding to the second processor mode from the first power mode in response to receiving the request for the second processor mode; the system receiving an indicator of acceptability of the first processor mode (4208); and the system switching to the first power mode (4210) from the second power mode in response to receiving the indicator of acceptability of the first processor mode, wherein the first power mode is a multiplane power mode, wherein the first processor mode is a multiplane processor mode, in which the system renders and projects images on a plurality of depth planes, such that the system operates in the multiplane power mode corresponding to the multiplane processor mode; wherein the second power mode is a single plane power mode, and wherein the second processor mode is a single plane processor mode, in which the system renders and projects images on a single depth plane, such that: the system receives a request for the single plane processor mode when the system receives an indicator of single plane activity; the system switches to the single plane power mode corresponding to the single plane processor mode from the multiplane power mode in response to receiving the indicator of single plane activity; the system receives the indicator of acceptability of the multiplane processor mode (4208) when the system receives an indicator of multiplane activity; and the system switches to the multiplane power mode from the single plane power mode in response to receiving the indicator of multiplane activity.
  9. The method (4200) of Claim 8, wherein the indicator of single plane activity comprises a user requesting a movie to be displayed on a virtual screen, the user opening a 2D application, or sensor data indicating that the user's gaze is converging to a particular plane for a predetermined threshold amount of time, the method further comprising switching between a discrete imaging mode and a multiplane imaging mode during a blink or an eye movement.
  10. The method (4200) of Claim 8 or 9, wherein the indicator of multiplane activity comprises a user requesting that a movie currently displayed on a virtual screen be halted, or sensor data indicating that the user's gaze is converging away from a particular plane for a predetermined threshold amount of time.
  11. A method (4200) in a virtual, augmented, or mixed reality system, the method comprising the system operating in a first power mode (4202) corresponding to a first processor mode; the system receiving a request for a second processor mode (4204'); the system switching to a second power mode (4206) corresponding to the second processor mode from the first power mode in response to receiving the request for the second processor mode; the system receiving an indicator of acceptability of the first processor mode (4208'); and the system switching to the first power mode (4210) from the second power mode in response to receiving the indicator of acceptability of the first processor mode, wherein the first power mode is a multiplane power mode, wherein the first processor mode is a multiplane processor mode, in which the system renders and projects images on a plurality of depth planes, such that the system operates in the multiplane power mode corresponding to the multiplane processor mode; wherein the second power mode is a single plane power mode, and wherein the second processor mode is a single plane processor mode, in which the system renders and projects images on a single depth plane, such that: the system receives a request for the single plane processor mode when the system receives an indicator of the system reaching a predetermined threshold; the system switches to the single plane power mode corresponding to the single plane processor mode from a multiplane imaging mode in response to receiving the indicator of the system reaching a predetermined threshold (4204'); the system receives the indicator of acceptability of the multiplane processor mode when the system receives an indicator of normal system operation; and the system switches to the multiplane power mode from the single plane power mode in response to receiving the indicator of normal system operation.
  12. The method (4200) of Claim 11, wherein the predetermined threshold comprises a temperature threshold or a battery power remaining threshold, and wherein the indicator of normal system operation comprises having no system characteristic indicating a temperature or a battery power reaching a threshold.
  13. The method (4200) of Claims 11 or 12, further comprising switching between a discrete imaging mode and the multiplane imaging mode during a blink or an eye movement.
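The mode-switching behavior of Claims 1-7 amounts to a small state machine with hysteresis: escalate the power mode immediately on demand, and de-escalate only after activity stays below a threshold for a predetermined time. The sketch below is purely illustrative; the class name, thresholds, and frame-based dwell counter are assumptions for exposition, not the patented implementation.

```python
# Hypothetical sketch of the power-mode hysteresis described in Claims 1-7.
# All names and threshold values are illustrative assumptions.

LOW, NORMAL, HIGH = "low", "normal", "high"

class PowerModeController:
    def __init__(self, pose_threshold=5.0, render_threshold=100, dwell_frames=3):
        self.mode = LOW                           # start in the low power mode (Claim 1)
        self.pose_threshold = pose_threshold      # pose change that requests normal mode
        self.render_threshold = render_threshold  # virtual-object count that requests high mode
        self.dwell_frames = dwell_frames          # "predetermined time", modeled as a frame count
        self._quiet_frames = 0                    # consecutive frames below both thresholds

    def update(self, pose_change, objects_requested):
        """Advance one frame: escalate immediately, de-escalate after a quiet dwell."""
        if objects_requested > self.render_threshold:
            self.mode = HIGH          # Claim 5: heavy render request -> high power mode
            self._quiet_frames = 0
        elif pose_change > self.pose_threshold:
            if self.mode == LOW:
                self.mode = NORMAL    # Claim 3: pose change requests the normal mode
            self._quiet_frames = 0
        else:
            # Claims 4 and 7: sustained low activity is the "indicator of acceptability"
            # of the lower mode, so step the power mode back down.
            self._quiet_frames += 1
            if self._quiet_frames >= self.dwell_frames:
                self.mode = NORMAL if self.mode == HIGH else LOW
                self._quiet_frames = 0
        return self.mode
```

For example, a large pose change moves the controller from the low to the normal mode on the same frame, while the return to the low mode happens only after `dwell_frames` consecutive quiet frames, mirroring the asymmetric escalation/de-escalation the claims describe.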
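Claims 8-13 describe a second, orthogonal decision: whether to render on a plurality of depth planes or drop to a single depth plane, driven either by activity indicators (Claims 8-10) or by system limits such as temperature or remaining battery (Claims 11-12). A minimal sketch of that decision step follows; the function name and the concrete limit values are hypothetical, since the claims leave them unspecified.

```python
# Hypothetical sketch of the depth-plane mode selection in Claims 8-13.
# Names and threshold values are illustrative assumptions.

MULTIPLANE, SINGLE_PLANE = "multiplane", "single_plane"

def select_plane_mode(current_mode, *, single_plane_activity=False,
                      multiplane_activity=False, temperature_c=35.0,
                      battery_pct=100.0, temp_limit_c=45.0, battery_floor_pct=10.0):
    """Return the next imaging mode for one decision step.

    Claims 8-10: single-plane activity (e.g. the user watching a movie on a
    virtual screen, or gaze converging to one plane) drops the system to the
    single plane mode; multiplane activity restores the multiplane mode.
    Claims 11-12: reaching a temperature or remaining-battery threshold also
    forces the single plane mode until normal operation resumes.
    """
    over_budget = temperature_c >= temp_limit_c or battery_pct <= battery_floor_pct
    if over_budget or single_plane_activity:
        return SINGLE_PLANE
    if current_mode == SINGLE_PLANE and multiplane_activity:
        return MULTIPLANE  # indicator of acceptability of the multiplane mode
    return current_mode
```

Note that the system-limit branch dominates: multiplane activity does not restore the multiplane mode while the device is over its temperature or battery threshold. Claims 9 and 13 additionally suggest performing the actual switch during a blink or eye movement so it is imperceptible to the user; that timing concern is outside this sketch.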

Description

Field of the Invention

The present disclosure relates to virtual reality, augmented reality, and mixed reality imaging, visualization, and display systems and methods.

Background

Modern computing and display technologies have facilitated the development of virtual reality ("VR"), augmented reality ("AR"), and mixed reality ("MR") systems. VR systems create a simulated environment for a user to experience. This can be done by presenting computer-generated imagery to the user through a head-mounted display. This imagery creates a sensory experience which immerses the user in the simulated environment. A VR scenario typically involves presentation of only computer-generated imagery, rather than also including actual real-world imagery.

AR systems generally supplement a real-world environment with simulated elements. For example, AR systems may provide a user with a view of the surrounding real-world environment via a head-mounted display. However, computer-generated imagery can also be presented on the display to enhance the real-world environment. This computer-generated imagery can include elements which are contextually related to the real-world environment, such as simulated text, images, and objects. MR systems also introduce simulated objects into a real-world environment, but these objects typically feature a greater degree of interactivity than in AR systems; the simulated elements can oftentimes be interactive in real time.

Figure 1 depicts an example AR/MR scene 1 where a user sees a real-world park setting 6 featuring people, trees, and buildings in the background, and a concrete platform 20. In addition to these items, computer-generated imagery is also presented to the user.
The computer-generated imagery can include, for example, a robot statue 10 standing upon the real-world platform 20, and a cartoon-like avatar character 2 flying by which seems to be a personification of a bumble bee, even though these elements 2, 10 are not actually present in the real-world environment.

Various optical systems generate images at various depths for displaying VR, AR, or MR scenarios. Some such optical systems are described in U.S. Utility Patent Application Publication No. 2016/032884. Other such optical systems for displaying MR experiences are described in U.S. Utility Patent Application Publication No. 2019/0094981, which is incorporated herein as though set forth in full.

Because the human visual perception system is complex, it is challenging to produce a VR/AR/MR technology that facilitates a comfortable, natural-feeling, rich presentation of virtual image elements amongst other virtual or real-world imagery elements. Improved techniques are needed for processing image data in such systems, including, for example, techniques for providing control data to control how the image data is displayed, techniques for correcting optical distortions in the image data, techniques for displaying and blending image data from many depth planes, and techniques for warping image data based on the head pose of a user. VR/AR/MR technology also has size and portability issues, battery life issues, system overheating issues, and other system and optical challenges. Improved techniques are needed to address these issues, including, for example, overheat cause identification, time-domain power management, a discrete imaging mode, and eye/gaze-tracking-based rendering modification. The systems and methods described herein are configured to address these and other challenges.

JP 2009-105593 A1 discloses a head-mounted display device with power management capable of preventing an unintended shutdown by warning the user in advance.
While these approaches provide technical advances, improvements remain desirable. What is needed are techniques that improve over legacy techniques and/or over other considered approaches. Some of the approaches described in this background section are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued.

Summary

The invention is directed to methods according to claims 1, 5, 8 and 11. Further developments of the invention are according to dependent claims 2-4, 6, 7, 9, 10, 12 and 13.

Brief Description of the Drawings

The drawings described below are for illustration purposes only and are not intended to limit the scope of the present disclosure. The drawings illustrate the design and utility of various embodiments of the present disclosure. It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. In order to better appreciate how to obtain the recited and other advantages and objects of various embodiments of the disclosure, a more detailed description of the present disclosure will be rendered by reference to specific embodiments thereof, which are illustrated in the accompanying drawings. Understanding