US-20260126886-A1 - ELECTRONIC DEVICE, METHOD, AND COMPUTER-READABLE STORAGE MEDIUM FOR DISPLAYING VISUAL OBJECT RELATED TO APPLICATION IN VIRTUAL SPACE
Abstract
A wearable device is provided. The wearable device includes a display, at least one sensor, memory, including one or more storage media, storing instructions, and at least one processor communicatively coupled to the display, the at least one sensor, and the memory, wherein the instructions, when executed by the at least one processor individually or collectively, cause the wearable device to identify at least one application being executed in a first virtual space, while the first virtual space is displayed, switch the first virtual space to a second virtual space based on the execution of a first application, identify a second application executed in the first virtual space among the at least one application, while the second virtual space is displayed, identify a motion of the wearable device while displaying a first portion of the second virtual space based on a first direction in which a user's gaze is directed, and display a visual object related to the second application in a second portion of the second virtual space based on a second direction in which the user's gaze is directed, the second direction being changed from the first direction according to the motion of the wearable device.
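The abstract describes a gaze-and-motion driven flow: applications keep running in a first virtual space after the display switches to a second one, and once the wearer's gaze turns far enough from its original direction, a visual object for the backgrounded application is surfaced. A minimal Python sketch of that flow, assuming a simple yaw-angle gaze model and a fixed threshold (all class, field, and application names here are hypothetical illustrations; the publication contains no source code):

```python
from dataclasses import dataclass, field

@dataclass
class VirtualSpace:
    # A virtual space that hosts zero or more running applications.
    name: str
    running_apps: list = field(default_factory=list)

class WearableDevice:
    # Illustrative model of the claimed flow; names and the yaw-threshold
    # gaze model are assumptions, not taken from the patent.
    def __init__(self, first_space, second_space, gaze_threshold_deg=30.0):
        self.first_space = first_space
        self.second_space = second_space
        self.current_space = first_space
        self.gaze_deg = 0.0                      # the "first direction"
        self.gaze_threshold_deg = gaze_threshold_deg
        self.visible_objects = []

    def execute_app(self, app):
        # An input executing the first application switches the display
        # from the first virtual space to the second virtual space.
        self.second_space.running_apps.append(app)
        self.current_space = self.second_space

    def on_motion(self, delta_deg):
        # Device motion rotates the gaze direction; once it deviates far
        # enough from the first direction, surface a visual object for each
        # application still executing in the first virtual space.
        self.gaze_deg += delta_deg
        if (self.current_space is self.second_space
                and abs(self.gaze_deg) >= self.gaze_threshold_deg):
            for app in self.first_space.running_apps:
                label = f"visual-object:{app}"
                if label not in self.visible_objects:
                    self.visible_objects.append(label)
```

With a music application running in the first space, executing a game switches spaces, and a head turn beyond the threshold surfaces `visual-object:music-player` in the portion of the second space now in view.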
Inventors
- Sangheon KIM
- Youngjung KIM
- Miji PARK
- Hungi PARK
- Jinwan AN
- Sangyong LEE
- Jiwoo LEE
Assignees
- SAMSUNG ELECTRONICS CO., LTD.
Dates
- Publication Date
- 2026-05-07
- Application Date
- 2025-12-31
- Priority Date
- 2023-07-24
Claims (20)
- 1 . A wearable device comprising: a display; at least one sensor; memory, comprising one or more storage media, storing instructions; and at least one processor, comprising processing circuitry, communicatively coupled to the display, the at least one sensor, and the memory, wherein the instructions, when executed by the at least one processor individually or collectively, cause the wearable device to: while a first virtual space is displayed, identify at least one application being executed in the first virtual space, identify an input for executing a first application in the first virtual space, based on the execution of the first application, switch the first virtual space to a second virtual space provided according to the first application, while the second virtual space is displayed, identify a second application executed in the first virtual space among the at least one application, identify a motion of the wearable device during displaying a first portion of the second virtual space based on a first direction in which a user's gaze is directed, and display a visual object related to the second application in a second portion of the second virtual space based on a second direction in which the user's gaze is directed changed from the first direction according to the motion of the wearable device.
- 2 . The wearable device of claim 1 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to: display, based on identifying an input for the visual object, an interface related to the second application within a designated region in the second portion of the second virtual space.
- 3 . The wearable device of claim 1 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to: based on identifying an input for the visual object, perform a switch from the second virtual space to the first virtual space in which the second application is executed, and display an interface related to the second application in the first virtual space.
- 4 . The wearable device of claim 1 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to: display an object including one or more elements related to one or more applications being executed in at least one virtual space distinct from the second virtual space on a designated region in the first portion of the second virtual space.
- 5 . The wearable device of claim 4 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to: identify an input to an element related to the second application, among the one or more elements, and based on identifying the input to the element, display an interface related to the second application in the first portion in the second virtual space.
- 6 . The wearable device of claim 5 , wherein the element is displayed based on execution information of an operation related to the second application.
- 7 . The wearable device of claim 4 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to display, based on identifying an input for removing the object, an interface related to the second application in the first portion in the second virtual space.
- 8 . The wearable device of claim 5 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to remove, based on identifying an input for switching from the second virtual space to the first virtual space, the element in the object displayed in the first virtual space.
- 9 . The wearable device of claim 1 , wherein the instructions, when executed by the at least one processor individually or collectively, further cause the wearable device to display, based on the second direction, at least one interface related to the at least one application in the second portion of the second virtual space.
- 10 . The wearable device of claim 1 , wherein the first virtual space is configured to comprise at least one region for executing the at least one application, and wherein the second virtual space is configured to provide an interface related to the first application.
- 11 . A method of a wearable device, the method comprising: while a first virtual space is displayed, identifying at least one application being executed in the first virtual space; identifying an input for executing a first application in the first virtual space; based on the execution of the first application, switching the first virtual space to a second virtual space provided according to the first application; while the second virtual space is displayed, identifying a second application executed in the first virtual space among the at least one application; identifying a motion of the wearable device during displaying a first portion of the second virtual space based on a first direction in which a user's gaze is directed; and displaying a visual object related to the second application in a second portion of the second virtual space based on a second direction in which the user's gaze is directed changed from the first direction according to the motion of the wearable device.
- 12 . The method of claim 11 , further comprising displaying, based on identifying an input for the visual object, an interface related to the second application within a designated region in the second portion of the second virtual space.
- 13 . The method of claim 11 , further comprising: based on identifying an input for the visual object, performing a switch from the second virtual space to the first virtual space in which the second application is executed; and displaying an interface related to the second application in the first virtual space.
- 14 . The method of claim 11 , further comprising: displaying an object including one or more elements related to one or more applications being executed in one or more virtual spaces distinct from the second virtual space on a designated region in the first portion of the second virtual space.
- 15 . The method of claim 14 , further comprising: identifying an input to an element related to the second application, among the one or more elements; and based on identifying the input to the element, displaying an interface related to the second application in the first portion in the second virtual space.
- 16 . The method of claim 15 , wherein the element is displayed based on execution information of an operation related to the second application.
- 17 . The method of claim 14 , further comprising: displaying, based on identifying an input for removing the object, an interface related to the second application in the first portion in the second virtual space.
- 18 . The method of claim 15 , further comprising: removing, based on identifying an input for switching from the second virtual space to the first virtual space, the element in the object displayed in the first virtual space.
- 19 . One or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that, when executed by one or more processors of a wearable device with a display and at least one sensor individually or collectively, cause the wearable device to perform operations, the operations comprising: while a first virtual space is displayed, identifying at least one application being executed in the first virtual space; identifying an input for executing a first application in the first virtual space; based on the execution of the first application, switching the first virtual space to a second virtual space provided according to the first application; while the second virtual space is displayed, identifying a second application executed in the first virtual space among the at least one application; identifying a motion of the wearable device during displaying a first portion of the second virtual space based on a first direction in which a user's gaze is directed; and displaying a visual object related to the second application in a second portion of the second virtual space based on a second direction in which the user's gaze is directed changed from the first direction according to the motion of the wearable device.
- 20 . The one or more non-transitory computer-readable storage media of claim 19 , the operations further comprising displaying, based on identifying an input for the visual object, an interface related to the second application within a designated region in the second portion of the second virtual space.
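The visual object introduced in claim 1 can be acted on in two alternative ways: claim 2 keeps the user in the second virtual space and opens the second application's interface in a designated region there, while claim 3 switches back to the first virtual space and opens the interface in it. A minimal sketch of that branching, assuming a plain-dict device state (the dict layout, mode names, and function name are hypothetical, not from the patent):

```python
def handle_visual_object_input(state, app, mode):
    # state is a plain dict: {"space": "first" | "second", "interfaces": [...]}
    # mode "in_place" models claim 2: show the second application's interface
    # in a designated region of the second virtual space.
    # mode "switch_back" models claim 3: return to the first virtual space
    # and show the interface there.
    if mode == "in_place":
        state["interfaces"].append(("second", f"interface:{app}"))
    elif mode == "switch_back":
        state["space"] = "first"
        state["interfaces"].append(("first", f"interface:{app}"))
    else:
        raise ValueError(f"unknown mode: {mode}")
    return state
```

Either branch is triggered by the same event, an input directed at the displayed visual object; the claims leave open which behavior a given embodiment adopts.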
Description
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation application, claiming priority under 35 U.S.C. § 365(c), of an International application No. PCT/KR2024/007588, filed on Jun. 3, 2024, which is based on and claims the benefit of a Korean patent application number 10-2023-0096421, filed on Jul. 24, 2023, in the Korean Intellectual Property Office, and of a Korean patent application number 10-2023-0128332, filed on Sep. 25, 2023, in the Korean Intellectual Property Office, the disclosure of each of which is incorporated by reference herein in its entirety.

BACKGROUND

1. Field

The disclosure relates to an electronic device, a method, and a computer-readable storage medium for displaying a visual object related to an application in a virtual space.

2. Description of Related Art

To provide an enhanced user experience, electronic devices are being developed that provide augmented reality (AR) and/or virtual reality (VR) services, displaying computer-generated information in connection with external objects in the real world. Such an electronic device may be a wearable device that may be worn by a user, for example, AR glasses and/or a head-mounted device (HMD).

The above information is presented as background information only to assist with an understanding of the disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the disclosure.

SUMMARY

Aspects of the disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the disclosure is to provide an electronic device, a method, and a computer-readable storage medium for displaying a visual object related to an application in a virtual space.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.

In accordance with an aspect of the disclosure, a wearable device is provided. The wearable device includes a display, at least one sensor, memory, including one or more storage media, storing instructions, and at least one processor, comprising processing circuitry, communicatively coupled to the display, the at least one sensor, and the memory, wherein the instructions, when executed by the at least one processor individually or collectively, cause the wearable device to, while a first virtual space is displayed, identify at least one application being executed in the first virtual space, identify an input for executing a first application in the first virtual space, based on the execution of the first application, switch the first virtual space to a second virtual space provided according to the first application, while the second virtual space is displayed, identify a second application executed in the first virtual space among the at least one application, identify a motion of the wearable device during displaying a first portion of the second virtual space based on a first direction in which a user's gaze is directed, and display a visual object related to the second application in a second portion of the second virtual space based on a second direction in which the user's gaze is directed changed from the first direction according to the motion of the wearable device.

In accordance with another aspect of the disclosure, a method of a wearable device is provided.
The method includes, while a first virtual space is displayed, identifying at least one application being executed in the first virtual space, identifying an input for executing a first application in the first virtual space, based on the execution of the first application, switching the first virtual space to a second virtual space provided according to the first application, while the second virtual space is displayed, identifying a second application executed in the first virtual space among the at least one application, identifying a motion of the wearable device during displaying a first portion of the second virtual space based on a first direction in which a user's gaze is directed, and displaying a visual object related to the second application in a second portion of the second virtual space based on a second direction in which the user's gaze is directed changed from the first direction according to the motion of the wearable device.

In accordance with another aspect of the disclosure, one or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that, when executed by one or more processors of a wearable device with a display and at least one sensor individually or collectively, cause the wearable device to perform operations are provided. The operations include, while a first virtual space is displayed, identifying at least o