US-12620168-B2 - System for blending extended reality images streamed from a plurality of application instances to an extended reality device

US 12620168 B2

Abstract

An extended reality (XR) streaming method of streaming XR images between a plurality of XR application instances and an XR device is described. The XR streaming method comprises streaming, by a first XR application instance, first XR images to the XR device, wherein the first XR images streamed by the first XR application instance are associated with a first XR application; streaming, by a second XR application instance, second XR images to the XR device, wherein the second XR images streamed by the second XR application instance are associated with a second XR application; and blending, by the XR device, the first XR images received from the first XR application instance with the second XR images received from the second XR application instance to obtain blended XR images for display on the XR device.

Inventors

  • Philipp Landgraf
  • Alexander Werlberger

Assignees

  • HOLO-LIGHT GMBH

Dates

Publication Date
2026-05-05
Application Date
2023-08-22
Priority Date
2022-08-23

Claims (20)

  1. An extended reality (XR) streaming method for streaming XR images between a plurality of XR application instances and an XR device, wherein the XR streaming method comprises: streaming, from a first XR application instance of the plurality of XR application instances, first XR images to the XR device, wherein the first XR application instance comprises hardware or a combination of hardware and software that is configured to execute a first XR application, wherein the first XR application is an augmented reality, mixed reality, or virtual reality application, wherein the first XR images are associated with the first XR application; streaming, from a second XR application instance of the plurality of XR application instances, second XR images to the XR device, wherein the second XR application instance comprises hardware or a combination of hardware and software that is configured to execute a second XR application, wherein the second XR application is an augmented reality, mixed reality, or virtual reality application, wherein the second XR images are associated with the second XR application, wherein the first XR application and the second XR application are different from each other; streaming first depth data from the first XR application instance and second depth data from the second XR application instance to the XR device, wherein the first depth data comprise information of distances of pixels of the first XR images and the second depth data comprise information of distances of pixels of the second XR images from the XR device in a virtual landscape; and blending, by the XR device, the first XR images with the second XR images to obtain blended XR images to be displayed on the XR device, wherein blending comprises prioritizing each pixel of the blended XR images based on comparing the first depth data and the second depth data.
  2. The XR streaming method of claim 1, wherein the first XR images received from the first XR application instance are blended with the second XR images received from the second XR application instance according to a positional blending technique that blends the first and second XR images based on an orientation of the XR device and/or based on a location of the XR device.
  3. The XR streaming method of claim 1, wherein the plurality of XR application instances have respective priorities and wherein blending the first and second XR images is based on the priorities of the plurality of XR application instances.
  4. The XR streaming method of claim 3, further comprising cutting out, with the XR device, those pixels of the first and second XR images having a predefined cutout color, wherein blending comprises, for each of the pixels of the first and second XR images, taking the pixel with a valid value of the XR application instance having the highest priority, wherein cutting out a pixel comprises determining the pixel has an invalid value.
  5. The XR streaming method of claim 1, wherein alpha data is streamed to the XR device from the first XR application instance and the second XR application instance, respectively, wherein the alpha data comprises transparency information associated with pixels of the first and second XR images, and wherein the first XR images received from the first XR application instance are blended with the second XR images received from the second XR application instance based on the alpha data.
  6. The XR streaming method of claim 1, wherein the first XR application instance employs a first rendering engine, wherein the second XR application instance employs a second rendering engine, and wherein the first rendering engine is different from the second rendering engine.
  7. The XR streaming method of claim 1, wherein momentary position data is determined, wherein the momentary position data is associated with a momentary position of the XR device, wherein the momentary position data is forwarded to the first XR application instance and to the second XR application instance, and wherein the first XR application instance and the second XR application instance generate the XR images based on the momentary position data.
  8. The XR streaming method of claim 7, wherein the momentary position data is forwarded to the first XR application instance and to the second XR application instance in a synchronized manner.
  9. The XR streaming method of claim 7, wherein the momentary position data is associated with a position of at least one camera of the XR device.
  10. The XR streaming method of claim 1, wherein the first XR images are reprojected before blending and/or wherein the second XR images are reprojected before blending.
  11. The XR streaming method of claim 1, wherein XR image data associated with the first and second XR images comprise information on a view matrix and/or a projection matrix.
  12. An extended reality (XR) streaming system, comprising: an XR device comprising a projection surface, a camera, communication circuitry, a processor, and one or more memories having stored therein instructions executable by the processor to cause the XR device to: determine momentary position data of the XR device; transmit, via the communication circuitry, the momentary position data to one or more external computer devices implementing a plurality of XR application instances, wherein the plurality of XR application instances comprise corresponding hardware or a combination of corresponding hardware and software that is configured to execute XR applications, wherein the XR applications are augmented reality, mixed reality, or virtual reality applications, and wherein the XR application instances of the plurality of XR application instances are different from each other; receive, via the communication circuitry, streams of XR images from the plurality of XR application instances that have rendered the XR images based, at least in part, on the transmitted momentary position data; blend the streamed XR images based, at least in part, on prioritizing each pixel of the blended XR images based on comparing first and second streamed depth data and at least one of a positional blending technique and priorities of the plurality of XR application instances, wherein the first depth data are streamed to the XR device from a first XR application instance of the plurality of XR application instances, wherein the first depth data comprise information of distances of pixels of first XR images of the streamed XR images and the second depth data comprise information of distances of pixels of second XR images of the streamed XR images from the XR device in a virtual landscape; and display the blended XR images on the projection surface.
  13. The system of claim 12, wherein the one or more memories of the XR device further have stored therein instructions executable by the processor to cause the XR device to cut out those pixels of the streamed XR images that have a predefined cutout color, wherein the instructions to blend the streamed XR images based on priorities of the plurality of XR application instances comprise instructions executable by the processor to cause the XR device to take a pixel of a plurality of pixels of the streamed XR images corresponding to a highest priority if not cut out.
  14. The system of claim 12, wherein the instructions executable by the processor to cause the XR device to transmit the momentary position data to the plurality of XR application instances comprise instructions executable by the processor to cause the XR device to transmit the momentary position data to the plurality of XR application instances in a synchronized manner.
  15. The system of claim 14, wherein the XR device further comprises at least one camera, wherein the momentary position data is associated with a position of the at least one camera of the XR device.
  16. One or more non-transitory computer readable memories comprising instructions to: determine momentary position data of an extended reality (XR) device; transmit the momentary position data to instances of a plurality of XR applications, wherein the plurality of XR application instances comprise corresponding hardware or a combination of corresponding hardware and software that is configured to execute XR applications, wherein the XR applications are augmented reality, mixed reality, or virtual reality applications, and wherein the XR application instances of the plurality of XR application instances are different from each other; blend XR images streamed from the plurality of XR application instances based on streamed first and second depth data and at least one of a positional blending technique, priorities of the plurality of XR application instances, and alpha data from the plurality of XR application instances, wherein the first depth data are streamed from a first XR application instance of the plurality of XR application instances and the second depth data are streamed from a second XR application instance of the plurality of XR application instances, wherein the first depth data comprise information of distances of pixels of first XR images of the streamed XR images and the second depth data comprise information of distances of pixels of second XR images of the streamed XR images from the XR device in a virtual landscape, and wherein the instructions to blend the XR images comprise further instructions to blend the XR images based on prioritization of each pixel of the blended XR images based on a comparison of the first and second depth data; and display the blended XR images on a projection surface of the XR device.
  17. The non-transitory computer readable memories of claim 16, wherein at least two different rendering engines are used by the plurality of XR application instances and the plurality of XR application instances render the XR images based, at least in part, on momentary position data of the XR device.
  18. The non-transitory computer readable memories of claim 16, further comprising instructions to cut out those pixels of the streamed XR images that have a predefined cutout color, wherein the instructions to blend the streamed XR images based on priorities of the plurality of XR application instances comprise instructions to, for each pixel of corresponding XR images of different streams, take the pixel corresponding to a highest priority if not cut out.
  19. The non-transitory computer readable memories of claim 16, wherein the instructions to blend the XR images based on alpha data comprise instructions to blend the XR images based on alpha data streamed to the XR device from a first XR application instance and a second XR application instance of the plurality of XR application instances, wherein the alpha data comprise transparency information associated with pixels of the XR images, and wherein the first XR images received from the first XR application instance are blended with the second XR images received from the second XR application instance based on the alpha data.
  20. The non-transitory computer readable memories of claim 16, wherein the instructions to transmit the momentary position data to instances of a plurality of XR applications comprise instructions to transmit the momentary position data to the plurality of XR application instances in a synchronized manner.
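The pixel-level blending recited in claims 1, 3, and 4 combines three elements: a cutout color marking invalid pixels, a per-pixel depth comparison in which the nearer pixel wins, and instance priorities. The sketch below is a minimal illustration of that combination, not the patented implementation; the flat frame layout (lists of RGB tuples with per-pixel depths), the magenta cutout key, and the priority tie-breaking rule are assumptions made for the example.

```python
# Illustrative per-pixel blend of two streamed XR frames, assuming:
# - frames are flat lists of (r, g, b) tuples with parallel depth lists,
# - a magenta chroma key marks cutout (invalid) pixels,
# - ties and both-invalid cases fall back to instance priority.
CUTOUT_COLOR = (255, 0, 255)  # assumed cutout key, not specified in the patent

def blend_pixel(px1, depth1, px2, depth2, prio1=1, prio2=0):
    """Return the blended pixel for one screen position."""
    valid1 = px1 != CUTOUT_COLOR
    valid2 = px2 != CUTOUT_COLOR
    if valid1 and not valid2:
        return px1                              # only the first stream is valid
    if valid2 and not valid1:
        return px2                              # only the second stream is valid
    if valid1 and valid2 and depth1 != depth2:
        return px1 if depth1 < depth2 else px2  # nearer pixel wins
    return px1 if prio1 >= prio2 else px2       # priority breaks ties

def blend_frames(frame1, depths1, frame2, depths2):
    """Blend two equally sized frames pixel by pixel."""
    return [blend_pixel(p1, d1, p2, d2)
            for (p1, d1), (p2, d2)
            in zip(zip(frame1, depths1), zip(frame2, depths2))]
```

For a two-pixel frame where the first stream's second pixel carries the cutout color, the blend takes the nearer valid pixel at the first position and falls back to the second stream at the second position.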

Description

FIELD OF THE DISCLOSURE

Embodiments of the present disclosure generally relate to an extended reality (XR) streaming method of streaming XR images between a plurality of XR application instances and an XR device. Embodiments of the present disclosure further relate to an XR streaming system.

BACKGROUND

In certain XR applications, XR images to be displayed on an XR device of a user are streamed from an XR application instance that is implemented in an external computer device to the XR device. The XR device receives and displays the XR image stream, i.e. the XR images associated with the XR image stream are displayed on a display of the XR device. With use cases of XR devices becoming ever more complex, e.g. in the field of mechanical and electrical engineering, there is a need to expand the capabilities of XR streaming systems in order to address the increasing complexity. Thus, there is a need for an XR streaming method and system that allow for more diverse use cases.

SUMMARY

The following summary of the present disclosure is intended to introduce different concepts in a simplified form that are described in further detail in the detailed description provided below. This summary is neither intended to denote essential features of the present disclosure nor shall this summary be used as an aid in determining the scope of the claimed subject matter. Embodiments of the present disclosure provide an extended reality (XR) streaming method of streaming XR images between a plurality of XR application instances and an XR device.
The XR streaming method comprises the steps of: streaming, by means of a first XR application instance of the plurality of XR application instances, first XR images to the XR device, wherein the first XR images streamed by the first XR application instance are associated with a first XR application; streaming, by means of a second XR application instance of the plurality of XR application instances, second XR images to the XR device, wherein the second XR images streamed by the second XR application instance are associated with a second XR application; and blending, by means of the XR device, the first XR images received from the first XR application instance with the second XR images received from the second XR application instance, thereby obtaining blended XR images to be displayed on the XR device. Therein and in the following, the term “XR device” is understood to denote an electronic device that is configured to display an extended reality (XR) image, i.e. an augmented reality (AR) image, a mixed reality (MR) image, and/or a virtual reality (VR) image. For example, the XR device may be a head-mounted display, e.g. an electronic wearable having the shape of glasses. However, it is to be understood that the XR device may be established as any other XR-capable electronic device, e.g. as a smartphone or as a tablet. Moreover, the term “XR image” is understood to denote at least one (partially) virtual image. In the case of augmented reality or mixed reality, the XR image corresponds to at least one virtual image that is superimposed over reality. For example, the XR device may be a head-mounted display with a semi-transparent display, wherein the virtual image is displayed on the semi-transparent display, such that the user can directly see the environment through the semi-transparent display, but with the virtual image superimposed. As another example, the XR device may be a head-mounted display that is optically opaque.
In this case, the head-mounted display may comprise at least one internal camera, particularly several internal cameras being configured to capture images of the environment of the head-mounted display. The real images captured by means of the internal camera are superimposed with the virtual image(s), and the resulting superposition of the real image(s) and the augmented reality image(s) is displayed on a display of the head-mounted display. As another example, the XR device may be a smartphone or a tablet, wherein an image captured by means of a camera of the XR device is superimposed with the at least one virtual image, and the resulting image is displayed on a display of the XR device. In the case of virtual reality, the XR image corresponds to a virtual image being displayed on a display of the XR device. For example, the XR device may be a head-mounted display that is optically opaque. The XR images, namely the VR images, may be displayed on a display of the head-mounted display. Further, the term “XR application instance” is understood to denote suitable hardware, suitable software, or a combination of hardware and software that is configured to execute a certain XR application. For example, the XR application may be an engineering application that is configured to generate XR images associated with a 3D model of an object, e.g. of a car, of an engine, or of any other object. In a particular example, a car may be recognized in the at least one image captured by me