EP-4742024-A1 - METHOD AND ELECTRONIC DEVICE FOR PROVIDING USER INTERACTION FOR STEREOSCOPIC ELEMENT

EP 4742024 A1

Abstract

An embodiment of the present invention may comprise: a display (160); a sensor module (176); a memory (130) for storing instructions; and a processor (120). The instructions, when executed by the processor, may cause an electronic device to: detect a user input for calling a control menu while a three-dimensional stereoscopic object is displayed on the display; display the control menu on the display in response to the detected user input; detect a touch selecting an option item of the control menu; detect movement of the electronic device by using the sensor module while the touch on the selected option item is maintained; and change a user interface displayed on the display on the basis of the detected movement of the electronic device when the touch on the selected option item is released.

Inventors

  • LEE, YONGYEON
  • PARK, JIHAE
  • LEE, BYUNGHWA
  • HAN, HOON

Assignees

  • Samsung Electronics Co., Ltd.

Dates

Publication Date
2026-05-13
Application Date
2024-07-10

Claims (15)

  1. An electronic device (101) comprising: a display (160); a sensor module (176); a memory (130) configured to store instructions; and a processor (120), wherein the instructions, when executed by the processor, cause the electronic device to: while a three-dimensional (stereoscopic) object is displayed on the display, detect a user input for invoking a control menu; display the control menu on the display in response to the detected user input; detect a touch selecting an option item of the control menu; while the touch on the selected option item is maintained, detect movement of the electronic device by using the sensor module; and in case that the touch on the selected option item is released, change a user interface displayed on the display based on the detected movement of the electronic device.
  2. The electronic device of claim 1, wherein the instructions, when executed by the processor, cause the electronic device to: display the control menu in a corner edge area of the display in response to the detected user input; and display a pointer at the center of the display.
  3. The electronic device of claim 1, wherein the instructions, when executed by the processor, cause the electronic device to: in case that a depth movement is selected as an option item of the control menu, acquire first sensing data in a depth direction from the sensor module; and change depth information of a three-dimensional object displayed on the display according to the touch release, based on the acquired first sensing data.
  4. The electronic device of claim 1, wherein the instructions, when executed by the processor, cause the electronic device to: in case that a camera view change is selected as an option item of the control menu, acquire second sensing data in a depth direction from the sensor module; change a camera view, based on the acquired second sensing data; and change the user interface, based on the changed camera view.
  5. The electronic device of claim 1, wherein the instructions, when executed by the processor, cause the electronic device to: in case that a screen panning is selected as an option item of the control menu, acquire third sensing data from the sensor module; and change the user interface, based on the acquired third sensing data.
  6. The electronic device of claim 2, wherein the instructions, when executed by the processor, cause the electronic device to, in case that a three-dimensional object corresponds to the pointer based on the acquired sensing data, select the three-dimensional object.
  7. The electronic device of claim 6, wherein the instructions, when executed by the processor, cause the electronic device to, in case that the touch on the selected option item is released, execute a function corresponding to the selected three-dimensional object.
  8. The electronic device of claim 2, wherein the instructions, when executed by the processor, cause the electronic device to: in case that a three-dimensional object is positioned adjacent to a highlight area corresponding to the pointer based on the acquired sensing data, select the positioned three-dimensional object; and in case that the touch release is detected, execute a function corresponding to the selected three-dimensional object.
  9. The electronic device of claim 8, wherein the instructions, when executed by the processor, cause the electronic device to: move, to the highlight area, the three-dimensional object disposed adjacent to the highlight area; and in case that the three-dimensional object is aligned with the highlight area, select the three-dimensional object.
  10. The electronic device of claim 8, wherein the instructions, when executed by the processor, cause the electronic device to: move, to the highlight area, a first three-dimensional object disposed adjacent to the highlight area; and move the position of a second three-dimensional object displayed on the display, based on the movement of the first three-dimensional object.
  11. An operation method of an electronic device (101), the method comprising: while a three-dimensional (stereoscopic) object is displayed on a display (160) of the electronic device, detecting a user input for invoking a control menu; displaying the control menu on the display in response to the detected user input; detecting a touch selecting an option item of the control menu; while the touch on the selected option item is maintained, detecting movement of the electronic device by using the sensor module (176); and in case that the touch on the selected option item is released, changing a user interface displayed on the display based on the detected movement of the electronic device.
  12. The method of claim 11, wherein the displaying of the control menu comprises, in response to the detected user input, displaying the control menu in a corner edge area of the display, and displaying a pointer at the center of the display.
  13. The method of claim 11, further comprising: in case that a depth movement is selected as an option item of the control menu, acquiring first sensing data in a depth direction from the sensor module; and changing depth information of a three-dimensional object displayed on the display according to the touch release, based on the acquired first sensing data.
  14. The method of claim 11, further comprising: in case that a camera view change is selected as an option item of the control menu, acquiring second sensing data in a depth direction from the sensor module; changing a camera view, based on the acquired second sensing data; and changing the user interface, based on the changed camera view.
  15. The method of claim 11, further comprising: in case that a screen panning is selected as an option item of the control menu, acquiring third sensing data from the sensor module; and changing the user interface, based on the acquired third sensing data.
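The interaction recited in claims 1 and 11 can be sketched as a small, platform-neutral state machine. This is a hypothetical illustration only: class and method names such as `TiltInteraction`, the event model, and the simple summing of motion samples are assumptions, not part of the patent.

```python
# Hypothetical sketch of the claimed touch-and-tilt interaction (claims 1 and 11).
# Event names and the motion model are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TiltInteraction:
    """Accumulates device movement while an option item is held,
    then applies the change to the UI on touch release."""
    menu_visible: bool = False
    selected_option: Optional[str] = None
    accumulated_motion: list = field(default_factory=list)
    ui_state: dict = field(default_factory=lambda: {"depth": 0.0, "pan": (0.0, 0.0)})

    def on_menu_invoke(self) -> None:
        # Claim 1: display the control menu in response to the detected user input.
        self.menu_visible = True

    def on_option_touch_down(self, option: str) -> None:
        # Claim 1: a touch selects an option item (e.g. "depth", "camera", "pan").
        self.selected_option = option
        self.accumulated_motion = []

    def on_sensor_motion(self, dx: float, dy: float, dz: float) -> None:
        # Claims 1/11: movement is tracked only while the touch is maintained.
        if self.selected_option is not None:
            self.accumulated_motion.append((dx, dy, dz))

    def on_touch_release(self) -> None:
        # Claims 1/11: the UI changes, based on the detected movement, on release.
        if self.selected_option == "depth":
            # Claims 3/13: the depth option uses sensing data in the depth direction.
            self.ui_state["depth"] += sum(m[2] for m in self.accumulated_motion)
        elif self.selected_option == "pan":
            # Claims 5/15: panning uses in-plane sensing data.
            px, py = self.ui_state["pan"]
            self.ui_state["pan"] = (px + sum(m[0] for m in self.accumulated_motion),
                                    py + sum(m[1] for m in self.accumulated_motion))
        self.selected_option = None
```

The key property of the claimed flow, visible in the sketch, is that sensor movement only accumulates between touch-down and touch-release on an option item, and the user interface is updated once, at release.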

Description

[Technical Field]

Various embodiments of the disclosure provide a method and an electronic device for providing user interaction for a stereoscopic element.

[Background Art]

With the development of digital technology, various types of electronic devices such as mobile communication terminals, personal digital assistants (PDAs), electronic notebooks, smartphones, tablet personal computers (PCs), and wearable devices are widely used. To support and enhance the functions of such electronic devices, their hardware and/or software are continuously improved. For example, using voice recognition technology, an electronic device may control its own execution or that of an application, or perform a function of a web-based service, through a user's voice command. Alternatively, the electronic device may connect to a wireless input/output device (e.g., an earphone or headphone) through short-range wireless communication such as Bluetooth, and may output the sound of music or a video through the wireless input/output device. Alternatively, the electronic device (e.g., a smartphone) may be connected to a wearable display device (e.g., AR glasses) to provide extended reality (XR) content such as virtual reality (VR), augmented reality (AR), and/or mixed reality (MR) content. The above information may be presented as related art for the purpose of assisting in understanding the disclosure. No assertion or decision is made as to whether any of the above might be applicable as prior art with regard to the disclosure.

[Disclosure of Invention]

[Technical Problem]

An embodiment may provide a method and device for providing a control menu configured to enable a user to operate an electronic device with one hand without obscuring a three-dimensional object displayed on a display of the electronic device, and for changing a user interface through the control menu.
[Solution to Problem]

An electronic device 101 according to an embodiment of the disclosure may include a display 160, a sensor module 176, a memory 130 configured to store instructions, and a processor 120. The instructions, when executed by the processor, may cause the electronic device to detect a user input for invoking a control menu while a three-dimensional (stereoscopic) object is displayed on the display, display the control menu on the display in response to the detected user input, detect a touch selecting an option item of the control menu, detect movement of the electronic device by using the sensor module while the touch on the selected option item is maintained, and change a user interface displayed on the display based on the detected movement of the electronic device in case that the touch on the selected option item is released.

A method for operating the electronic device 101 according to an embodiment of the disclosure may include detecting a user input for invoking a control menu while a three-dimensional (stereoscopic) object is displayed on the display 160 of the electronic device, displaying the control menu on the display in response to the detected user input, detecting a touch selecting an option item of the control menu, detecting movement of the electronic device by using the sensor module 176 of the electronic device while the touch on the selected option item is maintained, and changing a user interface displayed on the display, based on the detected movement of the electronic device, in case that the touch on the selected option item is released.

[Advantageous Effects of Invention]

According to an embodiment, a display may be controlled without obscuring a three-dimensional element. According to an embodiment, a three-dimensional element may be controlled conveniently, without being obscured, while the user holds the electronic device with one hand.
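For the camera-view-change option (claims 4 and 14), the depth-direction sensing data moves a virtual camera, and the user interface is re-rendered from the changed view. A minimal sketch, assuming a camera on the depth axis and a pinhole-style scale formula; the function names, clamping range, and scale model are illustrative assumptions, not from the patent:

```python
# Hypothetical sketch of the camera-view-change option (claims 4 and 14).
# The clamp range and pinhole-style scale are illustrative assumptions.

def change_camera_view(camera_z: float, depth_delta: float,
                       near: float = 0.1, far: float = 10.0) -> float:
    """Move the camera along the depth axis by the sensed depth delta,
    clamped to the [near, far] range."""
    return min(far, max(near, camera_z + depth_delta))

def apparent_scale(object_z: float, camera_z: float, focal: float = 1.0) -> float:
    """Simple pinhole-style apparent scale of an object after the view change:
    objects appear larger as the camera moves closer to them."""
    return focal / max(1e-6, abs(object_z - camera_z))
```

For example, moving the device toward the user by 2 units would move the camera from `z = 5.0` to `z = 3.0`, enlarging the displayed three-dimensional objects accordingly.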
According to an embodiment, an environment in which a user interaction may be performed without being affected by the field of view toward a three-dimensional element may be provided, thereby improving the usability of an electronic device.

[Brief Description of Drawings]

FIG. 1 is a block diagram of an electronic device in a network environment according to an embodiment.
FIG. 2 is a diagram illustrating an example of controlling a three-dimensional object in an electronic device according to an embodiment.
FIG. 3 is a flowchart illustrating an operation method of an electronic device according to an embodiment.
FIG. 4 is a diagram illustrating an example of invoking a control menu in an electronic device according to an embodiment.
FIG. 5A is a diagram illustrating an example of selecting a depth movement of a control menu in an electronic device according to an embodiment.
FIG. 5B is a diagram illustrating an example of selecting a camera view change in a control menu in an electronic device according to an embodiment.
FIG. 5C is a diagram illustrating an example of selecting a screen movement of a control menu in an electronic device according to an embodiment.
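Claims 2 and 8-9 describe a pointer displayed at the center of the display and a highlight area: a three-dimensional object adjacent to the highlight area is moved onto it and selected. A minimal sketch of that selection rule, assuming normalized 2D screen coordinates; the function name, the distance threshold, and the coordinate model are assumptions for illustration:

```python
# Hypothetical sketch of pointer-based selection (claims 2, 8 and 9).
# Coordinates are normalized screen positions; the radius is an assumed threshold.
import math

def select_adjacent_object(objects, pointer=(0.5, 0.5), radius=0.1):
    """Return (index, snapped_position) for the object nearest the highlight
    area if it lies within `radius` of the pointer, else None (no selection)."""
    best = None
    for i, (x, y) in enumerate(objects):
        d = math.dist((x, y), pointer)
        if d <= radius and (best is None or d < best[1]):
            best = (i, d)
    if best is None:
        return None
    # Claim 9: move the adjacent object onto the highlight area, then select it.
    return best[0], pointer
```

Under this model, selection is driven entirely by device movement (which shifts the scene relative to the fixed center pointer), so the user's finger never has to cover the three-dimensional object being selected, which is the stated advantage of the disclosure.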