KR-102961223-B1 - ELECTRONIC APPARATUS AND CONTROLLING METHOD THEREOF
Abstract
An electronic device is disclosed. The electronic device includes a camera, a display, a communication interface for communicating with an external device, a memory storing at least one instruction, and a processor. By executing the at least one instruction, the processor: acquires information on a setting state of at least one of the electronic device and the external device when a user gesture captured through the camera is a capture gesture, the capture gesture being a preset gesture; controls the display to display a UI containing the information on the setting state; and, when a first gesture using a specific object is input while the UI is displayed, matches the input first gesture with the acquired information on the setting state and stores the pair in the memory as a custom gesture.
Inventors
- 황성준
- 김대웅
- 마지연
- 신동헌
- 김의준
- 이영아
Assignees
- 삼성전자주식회사 (Samsung Electronics Co., Ltd.)
Dates
- Publication Date: 2026-05-07
- Application Date: 2021-07-23
Claims (17)
- An electronic device comprising: a camera; a display; a communication interface configured to communicate with an external device; a memory storing at least one instruction; and a processor, wherein the processor, by executing the at least one instruction: acquires information on a setting state of at least one of the electronic device and the external device when a user gesture captured through the camera is a capture gesture, the capture gesture being a preset gesture; controls the display to display a UI containing the information on the setting state; when a first gesture using a specific object is input while the UI is displayed, matches the input first gesture with the acquired information on the setting state and stores the pair in the memory as a custom gesture; controls the camera to acquire an image corresponding to the user's body and gaze direction; and controls the display to display a UI corresponding to performance of a gesture-recognition function when, based on the acquired image, the gaze direction corresponds to the electronic device for a threshold time or longer and the angle of an arm of the body is within a preset range.
- The electronic device of claim 1, wherein the processor changes the setting state of the electronic device and the external device based on the setting-state information stored in match with the first gesture, when the user gesture captured through the camera is the custom gesture.
- The electronic device of claim 1, wherein the processor: controls the display to display a UI for saving the custom gesture; when the first gesture to be saved as the custom gesture is recognized while the UI for saving the custom gesture is displayed, controls the display to display, until input of the first gesture is complete, a first UI element indicating the degree of input of the first gesture, the first UI element including a first image that visually represents the first gesture; while the first gesture is being input, changes the first UI element according to the degree of input so that the shape of the first UI element becomes progressively clearer; and, when the first gesture is changed to a second gesture while the first UI element is being changed, changes the first UI element to a second UI element, the second UI element including a second image that visually represents the second gesture.
- (Deleted)
- The electronic device of claim 1, wherein the processor changes the threshold time or the preset range based on whether the capture gesture or the custom gesture is recognized while a UI indicating that the electronic device is recognizing a gesture is displayed.
- The electronic device of claim 1, wherein the UI containing the information on the setting state includes a plurality of UI elements each corresponding to a respective item of the information on the setting state, and the processor, when an input selecting at least one of the plurality of UI elements is detected, matches the setting-state information corresponding to the selected UI element with the first gesture based on the input and stores the pair in the memory.
- The electronic device of claim 6, wherein the processor changes the range of selected UI elements based on the direction pointed to by both hands of the user or the distance between both hands of the user while the UI containing the information on the setting state is displayed.
- The electronic device of claim 2, wherein the specific object includes at least one of a game controller, a mouse, a remote control, and a mobile phone, and the processor changes the setting state of the electronic device and the external device based on the setting-state information corresponding to the specific object among the stored setting-state information.
- A method of controlling an electronic device, the method comprising: when a user gesture captured through a camera is a capture gesture, the capture gesture being a preset gesture, acquiring information on a setting state of at least one of the electronic device and an external device connected to the electronic device; controlling a display to display a UI containing the information on the setting state; when a first gesture using a specific object is input while the UI is displayed, matching the first gesture with the acquired information on the setting state and storing the pair in a memory as a custom gesture; acquiring, through the camera, an image corresponding to the user's body and gaze direction; and controlling the display to display a UI corresponding to performance of a gesture-recognition function when, based on the acquired image, the gaze direction corresponds to the electronic device for a threshold time or longer and the angle of an arm of the body is within a preset range.
- The method of claim 9, further comprising changing the setting state of the electronic device and the external device connected to the electronic device based on the setting-state information stored in match with the first gesture, when the user gesture captured through the camera is the custom gesture.
- The method of claim 9, further comprising: displaying a UI for saving the custom gesture; when the first gesture to be saved as the custom gesture is recognized while the UI for saving the custom gesture is displayed, displaying, until input of the first gesture is complete, a first UI element indicating the degree of input of the first gesture, the first UI element including a first image that visually represents the first gesture; changing the first UI element according to the degree of input so that the shape of the first UI element becomes progressively clearer while the first gesture is being input; and, when the first gesture is changed to a second gesture while the first UI element is being changed, changing the first UI element to a second UI element, the second UI element including a second image that visually represents the second gesture.
- (Deleted)
- The method of claim 9, further comprising changing the threshold time or the preset range based on whether the capture gesture or the custom gesture is recognized while a UI indicating that the electronic device is recognizing a gesture is displayed.
- The method of claim 9, wherein the UI containing the information on the setting state includes a plurality of UI elements each corresponding to a respective item of the information on the setting state, the method further comprising, when an input selecting at least one of the plurality of UI elements is detected, matching the setting-state information corresponding to the selected UI element with the first gesture based on the input and storing the pair in the memory.
- The method of claim 14, further comprising changing the range of selected UI elements based on the direction pointed to by both hands of the user or the distance between both hands of the user while the UI containing the information on the setting state is displayed.
- The method of claim 9, wherein the specific object includes at least one of a game controller, a mouse, a remote control, and a mobile phone, the method further comprising changing the setting state of the electronic device and the external device based on the setting-state information corresponding to the specific object among the stored setting-state information.
- A non-transitory computer-readable recording medium storing a program executable by a controller to perform a method of controlling an electronic device, the method comprising: when a user gesture captured through a camera is a capture gesture, the capture gesture being a preset gesture, acquiring information on a setting state of at least one of the electronic device and an external device connected to the electronic device; controlling a display to display a UI containing the information on the setting state; when a first gesture is input while the UI is displayed, matching the first gesture with the acquired information on the setting state and storing the pair in a memory as a custom gesture; acquiring, through the camera, an image corresponding to the user's body and gaze direction; and controlling the display to display a UI corresponding to performance of a gesture-recognition function when, based on the acquired image, the gaze direction corresponds to the electronic device for a threshold time or longer and the angle of an arm of the body is within a preset range.
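The two central mechanisms in the claims above — (a) showing the gesture-recognition UI only when the gaze dwells on the device for a threshold time and the arm angle is in a preset range, and (b) storing a first gesture matched with setting-state information as a custom gesture and later applying it — can be sketched in Python. This is an illustrative sketch only: the class and method names (`GestureController`, `should_show_recognition_ui`, etc.) and the threshold/range values are assumptions for clarity, not identifiers or parameters from the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GestureController:
    # Assumed example values; the patent leaves the threshold time and
    # arm-angle range as tunable parameters (see claims 5 and 13).
    gaze_threshold_s: float = 2.0
    arm_angle_range: tuple = (30.0, 150.0)  # degrees
    custom_gestures: dict = field(default_factory=dict)

    def should_show_recognition_ui(self, gaze_on_device_s: float,
                                   arm_angle_deg: float) -> bool:
        """Claims 1/9: display the gesture-recognition UI only when the gaze
        has stayed on the device for the threshold time or longer AND the
        arm angle is within the preset range."""
        lo, hi = self.arm_angle_range
        return gaze_on_device_s >= self.gaze_threshold_s and lo <= arm_angle_deg <= hi

    def store_custom_gesture(self, gesture_id: str, setting_state: dict) -> None:
        """Claims 1/9: match the input first gesture with the acquired
        setting-state information and store the pair as a custom gesture."""
        self.custom_gestures[gesture_id] = dict(setting_state)

    def apply_gesture(self, gesture_id: str) -> Optional[dict]:
        """Claims 2/10: if a captured gesture is a stored custom gesture,
        return the matched setting state to apply to the devices."""
        return self.custom_gestures.get(gesture_id)
```

For example, lifting a game controller (one of the specific objects named in claims 8 and 16) might be stored as a custom gesture whose matched setting state switches a TV to game mode; recognizing that gesture later retrieves and applies the stored settings.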
Description
The present disclosure relates to an electronic device and a method for controlling the same, and more particularly, to an electronic device and method for storing a custom gesture and changing the settings of the electronic device and an external device using the custom gesture.

With the advancement of technology, input methods beyond the remote control, such as voice and gestures, are available for controlling electronic devices. To manage the increasingly diverse functions and content of electronic devices such as TVs, gesture and voice control is becoming more common. Gesture control is intuitive and convenient, offering usability and functionality distinct from conventional input methods. Electronic devices typically recognize user gestures with a camera: the attached camera acquires images of the user's body, and the shapes of the user's arms and hands are recognized to support gesture recognition.

Controlling electronic devices with gestures, however, requires users to remember the specific gesture they wish to use. Since most gestures are predefined by device manufacturers, users face the inconvenience of memorizing them. Furthermore, because it is technically difficult to detect the starting point of a gesture, separate wake-up gestures are required, which inconveniences users. In addition, a device may malfunction by recognizing a gesture even when the user had no intention of issuing one.

- FIG. 1 is a block diagram illustrating the configuration of an electronic device according to one embodiment of the present disclosure.
- FIGS. 2A and 2B are drawings illustrating a method of displaying a first UI indicating that the electronic device is recognizing a gesture, according to one embodiment of the present disclosure.
- FIG. 3 is a flowchart illustrating a method by which the electronic device changes the conditions for displaying the first UI, according to one embodiment of the present disclosure.
- FIGS. 4A to 4E are drawings illustrating a method of displaying a second UI and selecting information on a setting state, according to one embodiment of the present disclosure.
- FIG. 5 is a drawing illustrating the process of storing a custom gesture according to one embodiment of the present disclosure.
- FIGS. 6A to 6D are drawings illustrating the process of storing a custom gesture according to one embodiment of the present disclosure.
- FIGS. 7A to 7D are drawings illustrating the process of storing a custom gesture according to one embodiment of the present disclosure.
- FIG. 8 is a drawing illustrating a method of inputting a gesture according to one embodiment of the present disclosure.
- FIG. 9 is a drawing illustrating a method of changing the setting state of an electronic device (100) and an external device connected to the electronic device using a custom gesture, according to one embodiment of the present disclosure.
- FIG. 10 is a flowchart illustrating a method of matching information on the setting state of at least one of an electronic device and an external device connected thereto with a custom gesture and storing it, according to one embodiment of the present disclosure.

The embodiments described herein are subject to various modifications and may take various forms; specific embodiments are illustrated in the drawings and described in detail below. However, this is not intended to limit the scope to the specific embodiments, and the description should be understood to include all modifications, equivalents, and/or alternatives of the embodiments of the present disclosure. In the description of the drawings, similar reference numerals may be used for similar components.
In describing the present disclosure, where a detailed description of related known functions or configurations could unnecessarily obscure the gist of the present disclosure, such description is omitted. In addition, the following embodiments may be modified in various other forms, and the scope of the technical concept of the present disclosure is not limited to them. Rather, these embodiments are provided so that the present disclosure will be thorough and complete and will fully convey its technical concept to those skilled in the art. The terms used in this disclosure are used merely to describe specific embodiments and are not intended to limit the scope of rights. Singular expressions include plural expressions unless the context clearly indicates otherwise. In the present disclosure, expressions such as "have," "may have," "include," or "may include" indicate the presence of the corresponding features (e.g., numerical values, functions, operations, or components such as parts) and do not exclude the presence of additional features.