US-20260127833-A1 - ELECTRONIC DEVICE AND METHOD FOR TRACKING EXTERNAL OBJECT IN VIRTUAL ENVIRONMENT

US 20260127833 A1

Abstract

A wearable device is provided. The wearable device includes memory, including one or more storage media, storing instructions, a display, communication circuitry, and one or more processors including processing circuitry communicatively coupled to the memory, the display, and the communication circuitry, wherein the instructions, when executed by the one or more processors individually or collectively, cause the wearable device to, based on the communication circuitry, identify an external object which is a target of a position tracking, in a virtual reality (VR) mode being executed based on identifying an event, display, via the display, a screen including a virtual environment, identify whether a position of the external object is within a reference identification area including an identification area for lost detection of the VR mode, based on identifying that the position is within the reference identification area, display, via the display, the screen including a visual object for the external object, and based on identifying that the position is outside the reference identification area, convert the VR mode to an augmented reality (AR) mode.
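The abstract describes a three-way decision: while in VR mode, the device keeps tracking the external object; if the object leaves the inner identification area but is still inside the reference identification area, a visual warning is shown; if it leaves the reference identification area entirely, the device falls back to AR mode. The sketch below is one illustrative reading of that logic, not the patent's implementation; the names (`Mode`, `Area`, `update_mode`) and the rectangular area model are assumptions introduced here.

```python
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    VR = "vr"
    AR = "ar"

@dataclass
class Area:
    """Axis-aligned rectangular area centered on the wearable device (meters).
    The patent does not specify a shape; a rectangle is assumed for brevity."""
    half_width: float
    half_depth: float

    def contains(self, x: float, y: float) -> bool:
        return abs(x) <= self.half_width and abs(y) <= self.half_depth

def update_mode(mode, position, identification_area, reference_area, show_visual_object):
    """Return the next display mode given the tracked object's position."""
    x, y = position
    if mode is Mode.VR:
        if reference_area.contains(x, y):
            if not identification_area.contains(x, y):
                # Past the VR lost-detection (identification) area but still
                # inside the reference identification area: warn the user
                # with a visual object instead of leaving VR.
                show_visual_object(position)
            return Mode.VR
        # Outside the reference identification area: fall back to AR mode.
        return Mode.AR
    return mode
```

Under this reading, the visual object acts as a pre-loss warning stage, and the VR-to-AR conversion is the last resort once the object can no longer be reliably tracked from within the virtual environment.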

Inventors

  • Doosuk KANG
  • Choonkyoung MOON
  • Hyunkee MIN
  • Mingyu Lee
  • Bokun Choi

Assignees

  • SAMSUNG ELECTRONICS CO., LTD.

Dates

Publication Date
2026-05-07
Application Date
2026-01-05
Priority Date
2023-07-14

Claims (20)

  1. A wearable device comprising: memory, comprising one or more storage media, storing instructions; a display; communication circuitry; and one or more processors including processing circuitry, wherein the instructions, when executed by the one or more processors individually or collectively, cause the wearable device to: based on the communication circuitry, identify an external object which is a target of a position tracking, in a virtual reality (VR) mode being executed based on identifying an event, display, via the display, a screen including a virtual environment, identify whether a position of the external object is within a reference identification area including an identification area for lost detection of the VR mode, based on identifying that the position is within the reference identification area, display, via the display, the screen including a visual object for the external object, and based on identifying that the position is outside the reference identification area, convert the VR mode to an augmented reality (AR) mode.
  2. The wearable device of claim 1, wherein the external object includes: a first electronic device providing information for the position tracking in a state where a connection with the wearable device is established; or a second electronic device providing information for the position tracking by transmitting and receiving a signal in a state where the connection with the wearable device is not established.
  3. The wearable device of claim 1, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: execute the VR mode based on obtaining the event, and wherein the event includes at least one of an input with respect to a physical button included in the wearable device for executing the VR mode, an input with respect to a partial area of the display, or a gesture of a user of the wearable device in a field of view (FOV) of the wearable device.
  4. The wearable device of claim 1, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: execute the AR mode based on identifying that a user of the wearable device is wearing the wearable device; and execute the VR mode converted from the AR mode based on identifying the event.
  5. The wearable device of claim 1, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: identify the identification area for the lost detection of the VR mode based on the VR mode being executed, wherein the identification area for the lost detection of the VR mode is included in another identification area for lost detection of the AR mode, and wherein the another identification area includes the reference identification area.
  6. The wearable device of claim 1, wherein the identification area is identified based on at least one of complexity with respect to a surrounding area of the wearable device, a risk with respect to the surrounding area, a time at which the VR mode is executed, or a direction of a field of view (FOV) of the wearable device.
  7. The wearable device of claim 6, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: identify the position of the external object based on the communication circuitry, wherein the position is identified further based on a positioning technique in the AR mode, and wherein the position is identified further based on at least one of information obtained from an access point (AP) at which the wearable device and the external object are connected or information with respect to a signal received from the external object, together with the positioning technique in the VR mode.
  8. The wearable device of claim 1, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: identify whether the position is within the identification area; based on identifying that the position is outside the identification area, identify whether the position is within the reference identification area; and based on identifying that the position is within the identification area, identify whether the position is within a movement detection area with respect to the external object, and wherein the movement detection area is included in the identification area and used to identify lost detection of the external object.
  9. The wearable device of claim 8, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: based on identifying that the position is outside the movement detection area, identify whether the position of the external object is within a reference detection area including the movement detection area; and identify a new position of the external object based on identifying that the position is within the movement detection area.
  10. The wearable device of claim 9, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: display, via the display, the visual object for the external object based on identifying that the position is within the reference detection area; and execute the AR mode converted from the VR mode based on identifying that the position is outside the reference detection area.
  11. The wearable device of claim 1, wherein the visual object includes a rendered image representing the external object or a text for notifying a risk of loss of the external object.
  12. The wearable device of claim 10, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: connect with an external electronic device performing the position tracking of the external object via the communication circuitry; and in response to an execution of the VR mode, transmit a signal notifying the execution of the VR mode to the external electronic device, and wherein the signal includes information with respect to the identification area and the reference identification area.
  13. The wearable device of claim 12, wherein the instructions, when executed by the one or more processors individually or collectively, further cause the wearable device to: receive, from the external electronic device, another signal notifying that the position of the external object is outside the identification area; and identify whether the position is within the reference identification area based on the another signal.
  14. A method performed by a wearable device, the method comprising: identifying an external object which is a target of a position tracking; in a virtual reality (VR) mode being executed based on identifying an event, displaying, via a display, a screen including a virtual environment; identifying whether a position of the external object is within a reference identification area including an identification area for lost detection of the VR mode; based on identifying that the position is within the reference identification area, displaying, the screen including a visual object for the external object; and based on identifying that the position is outside the reference identification area, converting the VR mode to an augmented reality (AR) mode.
  15. The method of claim 14, wherein the external object includes: a first electronic device providing information for the position tracking in a state where a connection with the wearable device is established; or a second electronic device providing information for the position tracking by transmitting and receiving a signal in a state where the connection with the wearable device is not established.
  16. The method of claim 14, further comprising: executing the VR mode based on obtaining the event, wherein the event includes at least one of an input with respect to a physical button included in the wearable device for executing the VR mode, an input with respect to a partial area of the display, or a gesture of a user of the wearable device in a field of view (FOV) of the wearable device.
  17. The method of claim 14, further comprising: executing the AR mode based on identifying that a user of the wearable device is wearing the wearable device; and executing the VR mode converted from the AR mode based on identifying the event.
  18. The method of claim 14, further comprising: identifying the identification area for the lost detection of the VR mode based on the VR mode being executed, wherein the identification area for the lost detection of the VR mode is included in another identification area for lost detection of the AR mode, and wherein the another identification area includes the reference identification area.
  19. The method of claim 14, wherein the identification area is identified based on at least one of complexity with respect to a surrounding area of the wearable device, a risk with respect to the surrounding area, a time at which the VR mode is executed, or a direction of a field of view (FOV) of the wearable device.
  20. One or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that, when executed by one or more processors of a wearable device individually or collectively, cause the wearable device to perform operations, the operations comprising: based on communication circuitry of the wearable device, identifying an external object which is a target of a position tracking; in a virtual reality (VR) mode being executed based on identifying an event, displaying via a display, a screen including a virtual environment; identifying whether a position of the external object is within a reference identification area including an identification area for lost detection of the VR mode; based on identifying that the position is within the reference identification area, displaying via the display, the screen including a visual object for the external object; and based on identifying that the position is outside the reference identification area, converting the VR mode to an augmented reality (AR) mode.
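Claims 8 through 10 describe a hierarchy of nested detection areas: a movement detection area inside the identification area, a reference detection area around the movement detection area, and the reference identification area around the identification area. The sketch below is an illustrative, non-normative reading of how a position could be classified against those areas; modeling each area as a concentric circle of a given radius is an assumption made here (the claims do not specify a shape), and the function and action names are invented for illustration.

```python
import math

def within(radius, position):
    """True if the position (x, y) lies within a circle of the given radius
    centered on the wearable device. Circular areas are an assumption."""
    return math.hypot(*position) <= radius

def next_action(position, r_movement, r_ref_detection,
                r_identification, r_ref_identification):
    """Classify a tracked object's position per claims 8-10 (illustrative)."""
    if within(r_identification, position):
        if within(r_movement, position):
            return "update_position"       # claim 9: identify the new position
        if within(r_ref_detection, position):
            return "show_visual_object"    # claim 10: warn of possible loss
        return "convert_to_ar"             # claim 10: exit VR mode
    if within(r_ref_identification, position):
        return "show_visual_object"        # claim 1: warn within reference area
    return "convert_to_ar"                 # claim 1: object lost, fall back to AR
```

Read this way, the inner pair of areas (movement detection and reference detection) governs routine tracking updates, while the outer pair (identification and reference identification) governs the escalation from warning to a full VR-to-AR conversion.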

Description

CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation application, claiming priority under 35 U.S.C. § 365(c), of International application No. PCT/KR2024/006833, filed on May 21, 2024, which is based on and claims the benefit of Korean patent application No. 10-2023-0092074, filed on Jul. 14, 2023, in the Korean Intellectual Property Office, and of Korean patent application No. 10-2023-0110899, filed on Aug. 23, 2023, in the Korean Intellectual Property Office, the disclosure of each of which is incorporated by reference herein in its entirety.

BACKGROUND

1. Field

The disclosure relates to an electronic device for tracking an external object in a virtual environment and a method thereof.

2. Description of Related Art

To provide an enhanced user experience, electronic devices are being developed that provide extended reality services, displaying computer-generated information in connection with an external object in the real world or a virtual object in a virtual world. The electronic device may include a wearable device that may be worn by a user. For example, the electronic device may include user equipment, augmented reality (AR) glasses, virtual reality (VR) glasses, and/or a head-mounted device (HMD) (e.g., a video see-through (VST) HMD or an optical see-through (OST) HMD). The above information is presented as background information only to assist with an understanding of the disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the disclosure.

SUMMARY

Aspects of the disclosure are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the disclosure is to provide an electronic device for tracking an external object in a virtual environment and a method thereof.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.

In accordance with an aspect of the disclosure, a wearable device is provided. The wearable device includes memory, including one or more storage media, storing instructions, a display, communication circuitry, and one or more processors communicatively coupled to the memory, the display, and the communication circuitry, wherein the instructions, when executed by the one or more processors individually or collectively, cause the wearable device to, based on the communication circuitry, identify an external object which is a target of a position tracking, in a virtual reality (VR) mode being executed based on identifying an event, display, via the display, a screen including a virtual environment, identify whether a position of the external object is within a reference identification area including an identification area for lost detection of the VR mode, based on identifying that the position is within the reference identification area, display, via the display, the screen including a visual object for the external object, and based on identifying that the position is outside the reference identification area, convert the VR mode to an augmented reality (AR) mode.

According to an embodiment, the external object may include an electronic device providing information for the position tracking in a state where a connection with the wearable device is established, or an electronic device providing information for the position tracking by transmitting and receiving a signal in a state where the connection with the wearable device is not established.

In accordance with another aspect of the disclosure, a method performed by a wearable device is provided. The method includes identifying an external object which is a target of a position tracking, in a virtual reality (VR) mode being executed based on identifying an event, displaying, via a display, a screen including a virtual environment, identifying whether a position of the external object is within a reference identification area including an identification area for lost detection of the VR mode, based on identifying that the position is within the reference identification area, displaying the screen including a visual object for the external object, and based on identifying that the position is outside the reference identification area, converting the VR mode to an augmented reality (AR) mode.

In accordance with another aspect of the disclosure, one or more non-transitory computer-readable storage media storing one or more computer programs including computer-executable instructions that, when executed by one or more processors of a wearable device individually or collectively, cause the wearable device to perform operations are provided. The operations include, based on communication circuitry of the wearable device, identifying an external object which is a target