
JP-2022542295-A5


Dates

Publication Date
2023-05-24
Application Date
2020-06-15

Description

[0112] The program code may be executed by a processor, which may include one or more processors, such as one or more digital signal processors (DSPs), general-purpose microprocessors, application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Such a processor may be configured to perform any of the techniques described herein. A general-purpose processor may be a microprocessor; alternatively, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Accordingly, the term "processor" as used herein may refer to any of the foregoing structures, any combination thereof, or any other structure or device suitable for implementing the techniques described herein. Furthermore, in some embodiments, the functionality described herein may be provided within dedicated software or hardware modules configured for encoding and decoding, or incorporated into a combined video encoder/decoder (codec).

The invention described in the original claims of this application is listed below.

[C1] A method comprising: displaying, by a first device, virtual reality content; obtaining, by the first device, a composite representation of a second device; displaying, by the first device, the composite representation of the second device together with the virtual reality content; receiving, by the first device, an input requesting a change in a function of the second device; and displaying, based on the input received by the first device, a change in the composite representation of the second device, the change in the composite representation of the second device representing the change in the function of the second device.
[C2] The method of C1, wherein the composite representation of the second device includes a composite representation of a display of the second device, and content displayed by the second device is displayed in the composite representation of the display of the second device.
[C3] The method of C1, further comprising: receiving, by the first device, an indication of one or more inputs processed by the second device; and displaying, based on the indication of the one or more inputs processed by the second device, an additional change in the composite representation of the second device, the additional change in the composite representation of the second device representing an additional change in the function of the second device.
[C4] The method of C1, wherein the composite representation of the second device is overlaid on the virtual reality content.
[C5] The method of C1, wherein the first device includes a virtual reality head-mounted display.
[C6] The method of C1, wherein the second device includes a mobile device.
[C7] The method of C1, further comprising: obtaining, by the first device, audio content from the second device; and playing, by the first device, the audio content from the second device.
[C8] The method of C1, further comprising: obtaining, by the first device, audio content captured by a microphone of the second device; and playing, by the first device, the audio content.
[C9] The method of C1, further comprising: obtaining, by the first device, one or more images captured by a camera of the second device; and displaying, by the first device, the one or more images together with the virtual reality content.
[C10] The method of C9, wherein the one or more images are part of a video captured by the camera of the second device.
[C11] The method of C9, wherein the one or more images are displayed together with the virtual reality content as part of the composite representation of the second device.
[C12] The method of C9, wherein the one or more images are displayed within the composite representation of the display of the second device.
[C13] The method of C9, further comprising: obtaining, by the first device, audio content captured by a microphone of the second device; and playing the audio content while the first device displays the one or more images together with the virtual reality content.
[C14] The method of C1, further comprising: receiving a trigger; and displaying the composite representation of the second device together with the virtual reality content based on receipt of the trigger.
[C15] The method of C14, wherein the trigger is based on information received from the second device.
[C16] The method of C1, further comprising: receiving a removal trigger; and removing, by the first device, the composite representation of the second device from display based on receipt of the removal trigger.
[C17] A device comprising: a memory configured to store content for display; and one or mo
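The method recited in claim C1 can be sketched in code for illustration only. This is a minimal, hypothetical model, not an implementation from the patent: every name here (`FirstDevice`, `SecondDevice`, `obtain_composite_representation`, the ringer-mute function) is an assumption chosen to make the claimed sequence of steps concrete.

```python
from dataclasses import dataclass, field

@dataclass
class SecondDevice:
    """Hypothetical stand-in for the second device (e.g., a mobile device)."""
    ringer_muted: bool = False  # example function of the second device

@dataclass
class FirstDevice:
    """Hypothetical stand-in for the first device (e.g., a VR head-mounted display)."""
    displayed: list = field(default_factory=list)

    def display_vr_content(self, content):
        # Step 1: the first device displays virtual reality content.
        self.displayed.append(content)

    def obtain_composite_representation(self, device):
        # Step 2: obtain a composite representation reflecting the
        # second device's current state.
        return {"device": "second", "ringer_muted": device.ringer_muted}

    def display_composite(self, representation):
        # Step 3: display the composite representation with the VR content.
        self.displayed.append(representation)

    def receive_function_change_input(self, device, representation):
        # Steps 4-5: an input requests a change in a function of the second
        # device (here: muting its ringer); the composite representation is
        # updated so the displayed change represents the functional change.
        device.ringer_muted = True
        representation["ringer_muted"] = True

# Walk through the steps of claim C1:
hmd, phone = FirstDevice(), SecondDevice()
hmd.display_vr_content("vr-scene")
rep = hmd.obtain_composite_representation(phone)
hmd.display_composite(rep)
hmd.receive_function_change_input(phone, rep)
print(phone.ringer_muted, rep["ringer_muted"])  # prints: True True
```

The sketch keeps the two parallel effects of the final step explicit: the second device's function changes, and the composite representation displayed by the first device changes to represent it.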