KR-102961797-B1 - ELECTRONIC DEVICE FOR IDENTIFYING STATE USING SENSOR
Abstract
According to one embodiment, an electronic device comprises: a first housing including a first surface and a second surface facing away from the first surface; a second housing including a third surface and a fourth surface facing away from the third surface; a folding housing pivotably connecting a side of the first housing and a facing side of the second housing; at least one inertial sensor within at least one of the first housing or the second housing; a first display disposed across the folding housing on the first surface and the third surface; a second display disposed on the second surface or the fourth surface; and at least one processor. The at least one processor may be configured to: while the second display is deactivated in a folded state in which the first surface faces the third surface, obtain, through the at least one inertial sensor, first data indicating a posture of the electronic device and second data indicating a movement state of the electronic device; detect a designated event; while identifying, based on the first data and the second data, that the electronic device is moved to a designated posture, change a time during which the second display is activated according to the designated event, based on reception of a touch input on the activated second display; and while identifying, based on the first data and the second data, that the electronic device is moved to a posture distinct from the designated posture, maintain the time independently of the reception of the touch input, based on the detection of the designated event.
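The behavior summarized in the abstract can be sketched in code. This is an illustrative reading only: the function names, units, the g-tolerance, the sample count, and the interpretation of "change" as extending the activation timeout are assumptions for illustration, not specifics taken from the patent.

```python
import math

G = 9.81  # standard gravitational acceleration, m/s^2 (assumed units)

def posture_from_accel(samples, tolerance=0.5, required_count=5):
    """Estimate posture (first data) from accelerometer vectors.

    One plausible reading of the claimed procedure: a vector is usable when
    the difference between its length and gravitational acceleration is
    within `tolerance`; once `required_count` such vectors are collected,
    the mean x/y/z components describe the device's relationship to gravity.
    """
    stable = [v for v in samples
              if abs(math.hypot(*v) - G) <= tolerance]
    if len(stable) < required_count:
        return None  # not enough near-stationary vectors yet
    n = len(stable)
    return tuple(sum(v[i] for v in stable) / n for i in range(3))

def timeout_after_touch(posture, designated, timeout_ms, extension_ms=5000):
    """Apply the abstract's policy to a touch on the activated second display:
    change the activation time when the device has moved to the designated
    posture, otherwise maintain it independently of the touch input."""
    if posture == designated:
        return timeout_ms + extension_ms  # change the activation time
    return timeout_ms                     # maintain the time
```

For example, a device lying flat would report vectors near (0, 0, 9.8), which pass the tolerance check; a device being shaken reports magnitudes far from g and yields no posture estimate.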
Inventors
- 조대현
- 김우영
- 이원희
- 최희준
Assignees
- 삼성전자주식회사
Dates
- Publication Date: 2026-05-07
- Application Date: 2021-10-14
Claims (20)
- An electronic device comprising: a first housing including a first surface and a second surface facing away from the first surface; a second housing including a third surface and a fourth surface facing away from the third surface; a folding housing rotatably connecting the first housing and the second housing; at least one inertial sensor within at least one of the first housing or the second housing; a first display disposed across the folding housing on the first surface and the third surface; a second display disposed on the second surface or the fourth surface; at least one processor; and memory storing instructions that, when executed by the at least one processor, cause the electronic device to: while the first display and the second display are deactivated in a folded state in which the first surface faces the third surface, detect a designated event for activating the second display of the first display and the second display; based on the designated event, activate the second display in the folded state; while the second display is activated based on the designated event in the folded state, receive a touch input on the activated second display; based on the touch input being received while identifying, through the at least one inertial sensor, that a posture of the electronic device in the folded state is a designated posture determined according to a direction of gravitational acceleration, change a time during which the second display is activated according to the designated event; and based on the touch input being received while identifying, through the at least one inertial sensor, that the posture of the electronic device in the folded state is a posture different from the designated posture, maintain the time.
- The electronic device of claim 1, wherein the instructions, when executed by the at least one processor, cause the electronic device to: while the first display and the second display are deactivated in the folded state, obtain, through the at least one inertial sensor, first data indicating the posture of the electronic device in the folded state and second data indicating a change in a position of the electronic device in the folded state; while the second display is activated based on the designated event in the folded state, receive the touch input on the activated second display; based on the touch input being received while identifying, based on the first data and the second data, that the position of the electronic device having the designated posture is maintained, change the time; and based on the touch input being received while identifying, based on the first data and the second data, that the position of the electronic device having a posture different from the designated posture is maintained, maintain the time.
- The electronic device of claim 1, wherein the instructions, when executed by the at least one processor, cause the electronic device to: identify a difference in length among vectors obtained through the at least one inertial sensor while the second display is deactivated; and based on identifying that the length difference is within a specified range, obtain first data representing the posture of the electronic device.
- The electronic device of claim 3, wherein the instructions, when executed by the at least one processor, cause the electronic device to obtain the first data based on identifying that a number of the vectors whose length difference is within the specified range reaches a specified number.
- The electronic device of claim 4, wherein the instructions, when executed by the at least one processor, cause the electronic device to, based on identifying that the number reaches the specified number, identify a relationship between each of an x-axis component, a y-axis component, and a z-axis component of each of the vectors and the gravitational acceleration, and wherein the first data indicates the relationship.
- The electronic device of claim 1, further comprising a proximity sensor included within the first display or positioned below the first display, wherein the instructions, when executed by the at least one processor, cause the electronic device to: while the first display and the second display are deactivated in the folded state, obtain, through the at least one inertial sensor, first data indicating the posture of the electronic device in the folded state and second data indicating a change in a position of the electronic device in the folded state; while the second display is deactivated in the folded state, identify, through the at least one inertial sensor among the at least one inertial sensor and the proximity sensor, whether the electronic device is in a first state of being contained within an external object without being gripped or in a second state different from the first state; and while the electronic device is in an unfolded state in which the first surface and the third surface form a single flat surface, identify, through the proximity sensor among the at least one inertial sensor and the proximity sensor, whether the electronic device is in the first state or in the second state, wherein the first data and the second data are obtained to identify whether the electronic device is in the first state or in the second state, and wherein the instructions, when executed by the at least one processor, cause the electronic device to: based on identifying, based on the first data and the second data, that the electronic device is moved to the designated posture, identify that the electronic device is in the first state; and based on identifying, based on the first data and the second data, that the electronic device is moved to the posture distinct from the designated posture, identify that the electronic device is in the second state.
- The electronic device of claim 6, wherein the instructions, when executed by the at least one processor, cause the electronic device to: based on identifying another designated event for activating the first display while the electronic device is in the unfolded state, obtain, through the proximity sensor, third data indicating a state around the electronic device; based on the third data, identify whether the electronic device is in the first state or in the second state; after obtaining the third data, activate the first display; and after activating the first display, cease obtaining the third data.
- The electronic device of claim 7, wherein the instructions, when executed by the at least one processor, cause the electronic device to: identify a first difference between the third data and first reference data indicating the first state, and a second difference between the third data and second reference data indicating the second state; and based on the first difference and the second difference, identify whether the electronic device is in the first state or in the second state.
- The electronic device of claim 8, wherein the instructions, when executed by the at least one processor, cause the electronic device to: while providing a call with a user of an external electronic device through a call application, obtain, through the proximity sensor, fourth data indicating the state around the electronic device; and based on the fourth data, update the first reference data and the second reference data, wherein the third data is obtained after the first reference data and the second reference data are updated based on the fourth data.
- The electronic device of claim 8, wherein the instructions, when executed by the at least one processor, cause the electronic device to: based on identifying, based on the third data, that the electronic device is in the first state, display, through the first display, a visual object guiding that reception of touch input on the first display is restricted; receive a user input for ceasing to display the visual object; and based on receiving the user input, update the first reference data and the second reference data.
- The electronic device of claim 6, wherein the proximity sensor is deactivated while the second display is deactivated in the folded state.
- The electronic device of claim 1, wherein the at least one processor includes a first processor operatively coupled to the at least one inertial sensor and a second processor operatively coupled to the first processor, wherein the first processor is configured to, while the second processor is deactivated and the second display is deactivated in the folded state, obtain, through the at least one inertial sensor, the first data indicating the posture of the electronic device and the second data indicating a change in a position of the electronic device, and wherein the second processor, activated based on detecting the designated event, is configured to activate the second display.
- The electronic device of claim 12, wherein the first processor among the first processor and the second processor is configured to, while the second processor is deactivated and the second display is deactivated in the folded state, identify, based on the first data and the second data, that the electronic device is moved to the designated posture or moved to the posture distinct from the designated posture, and wherein the second processor among the first processor and the second processor is configured to change the time based on receiving the touch input or to maintain the time independently of receiving the touch input.
- The electronic device of claim 1, wherein the designated event includes at least one of: receiving a designated touch input on the second display, receiving a press input on a physical button exposed through a part of the first housing or the second housing, or receiving a message from an external electronic device.
- An electronic device comprising: a display; a proximity sensor included within the display or positioned below the display; at least one processor; and memory storing instructions that, when executed by the at least one processor, cause the electronic device to: detect an event for activating the display; in response to detecting the event, obtain, through the proximity sensor, data indicating a state around the electronic device; based on identifying that the data is within a first reference range, identify that the electronic device is in a first state of being contained within an external object without being gripped; based on identifying that the data is within a second reference range distinct from the first reference range, identify that the electronic device is in a second state different from the first state; based on identifying that the data is outside the first reference range and the second reference range: identify a first difference between a representative value of first values within the first reference range and the data, and a second difference between a representative value of second values within the second reference range and the data, and based on the first difference and the second difference, identify whether the electronic device is in the first state or in the second state; based on identifying that the electronic device is in the first state, display, through the display activated according to the event, a first screen indicating that touch input on the display is restricted; and based on identifying that the electronic device is in the second state, display, through the display activated according to the event, a second screen different from the first screen.
- The electronic device of claim 15, wherein the instructions, when executed by the at least one processor, cause the electronic device to, in response to activating the display, cease obtaining the data through the proximity sensor.
- The electronic device of claim 15, wherein at least one of the first reference range and the second reference range is adjusted based on the data.
- The electronic device of claim 15, wherein the data is first data, wherein the instructions, when executed by the at least one processor, cause the electronic device to: while providing a call with a user of an external electronic device through a call application, obtain, through the proximity sensor, second data indicating the state around the electronic device; and based on the second data, adjust at least one of the first reference range and the second reference range, and wherein a weight applied to the second data for adjusting at least one of the first reference range and the second reference range is greater than a weight applied to the first data for adjusting at least one of the first reference range and the second reference range.
- (Deleted)
- The electronic device of claim 15, further comprising at least one inertial sensor, wherein the instructions, when executed by the at least one processor, cause the electronic device to: obtain, through the at least one inertial sensor, data indicating a posture of the electronic device; and identify, further based on the data indicating the posture, whether the electronic device is in the first state or in the second state.
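Claims 15 and 18 describe a concrete classification and adaptation procedure: proximity data falling within a first or second reference range maps directly to a state, values outside both ranges are resolved by comparing their differences from each range's representative value, and reference data measured during a call is blended in with a larger weight. A minimal sketch, assuming scalar proximity readings, midpoints as representative values, and illustrative weight constants (none of these specifics come from the patent):

```python
def classify(value, first_range, second_range):
    """Claim-15 style classification: 'first' = contained within an external
    object without being gripped (e.g. a pocket or bag), 'second' = otherwise.
    Ranges are (low, high) tuples; a value outside both ranges is resolved by
    comparing its differences from each range's representative value (the
    midpoint is an assumed choice of representative)."""
    (lo1, hi1), (lo2, hi2) = first_range, second_range
    if lo1 <= value <= hi1:
        return "first"
    if lo2 <= value <= hi2:
        return "second"
    rep1, rep2 = (lo1 + hi1) / 2, (lo2 + hi2) / 2
    first_diff, second_diff = abs(value - rep1), abs(value - rep2)
    return "first" if first_diff <= second_diff else "second"

# Claim-18 style adjustment: in-call proximity data (second data) moves the
# reference more strongly than display-event data (first data).
CALL_WEIGHT = 0.5   # assumed value; larger weight for in-call measurements
EVENT_WEIGHT = 0.1  # assumed value; smaller weight for event measurements

def adjust_representative(rep, measurement, weight):
    """Blend a new measurement into a reference representative value."""
    return (1 - weight) * rep + weight * measurement
```

The in-call weight is larger because during a call the device is reliably near the user's ear, so those readings are a trustworthy sample of the "covered" sensor response.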
Description
Electronic device for identifying state using a sensor

The following descriptions relate to an electronic device for identifying a state using a sensor. The electronic device may include at least one sensor to identify the state of the electronic device or a state outside the electronic device. For example, the at least one sensor may include at least one inertial sensor used to identify the posture of the electronic device or the movement of the electronic device. For example, the at least one sensor may include at least one illuminance sensor to identify the brightness around the electronic device, or a proximity sensor to identify whether the electronic device is adjacent to an external object.

- FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments.
- FIG. 2 is a simplified block diagram of an electronic device according to various embodiments.
- FIG. 3 is a flowchart illustrating a method for identifying the state of an electronic device based on data obtained through an inertial sensor, according to one embodiment.
- FIG. 4 illustrates an example of an unfolded state of an electronic device according to one embodiment.
- FIG. 5 illustrates an example of a folded state of an electronic device according to one embodiment.
- FIG. 6 illustrates an example of a specified state indicated by first data obtained according to one embodiment.
- FIG. 7 illustrates an example of a moving state of an electronic device indicated by second data obtained according to one embodiment.
- FIG. 8a illustrates an example of a method for identifying whether an electronic device is contained within an external object without being gripped, based on first data and second data, according to one embodiment.
- FIG. 8b is a timing diagram illustrating a method for controlling the time during which the second display (250) is activated according to a designated event.
- FIG. 9 is a flowchart illustrating a method for obtaining first data according to one embodiment.
- FIG. 10 is a flowchart illustrating a method for identifying the state of an electronic device using different sensors depending on whether the electronic device is in a folded state, according to one embodiment.
- FIG. 11 is a flowchart illustrating a method for identifying the state of an electronic device using a proximity sensor based on identifying an unfolded state, according to one embodiment.
- FIG. 12 is a flowchart illustrating a method for identifying the state of an electronic device using a first processor while a second processor is deactivated, according to one embodiment.
- FIG. 13 is a flowchart illustrating a method for identifying the state of an electronic device based on data obtained through a proximity sensor, according to one embodiment.
- FIG. 14 illustrates an example of a screen including a visual object displayed based on identifying that an electronic device is contained within an external object without being gripped, according to one embodiment.
- FIG. 15 is a flowchart illustrating a method for adjusting a first reference range and a second reference range according to one embodiment.
- FIG. 16 illustrates an example of a method for adjusting a representative value of first values within a first reference range and a representative value of second values within a second reference range, according to one embodiment.
- FIG. 17 illustrates an example of a method for dividing a first reference range into a first partial reference range and a second partial reference range, according to one embodiment.

FIG. 1 is a block diagram of an electronic device (101) in a network environment (100) according to various embodiments. Referring to FIG.
1, in a network environment (100), an electronic device (101) may communicate with an electronic device (102) through a first network (198) (e.g., a short-range wireless communication network) or with at least one of an electronic device (104) or a server (108) through a second network (199) (e.g., a long-range wireless communication network). According to one embodiment, the electronic device (101) may communicate with the electronic device (104) through a server (108). According to one embodiment, the electronic device (101) may include a processor (120), memory (130), input module (150), sound output module (155), display module (160), audio module (170), sensor module (176), interface (177), connection terminal (178), haptic module (179), camera module (180), power management module (188), battery (189), communication module (190), subscriber identification module (196), or antenna module (197). In some embodiments, at least one of these components (e.g., connection terminal (178)) may be omitted from the electronic device (101), or one or more other components may be added. In some embodiments, some of these components (e.g., sensor module (176), camera module (180), or antenna module (197)) may be integrated into a single component (e.g., display module (160)). The processo