
KR-20260065439-A - ELECTRONIC DEVICE, METHOD, AND NON-TRANSITORY COMPUTER READABLE STORAGE MEDIUM FOR CONTROLLING SCREEN DISPLAYED THROUGH DISPLAY

KR 20260065439 A

Abstract

A wearable device may include at least one processor and at least one camera. Using the at least one camera, the at least one processor identifies whether an eye of a user wearing the wearable device blinks within a specified time interval. Upon identifying that the user's eye does not blink within the specified time interval, the processor displays a first space between a first line and the top boundary of the screen displayed through the display, based on a second brightness lower than a first brightness, while the first line is displayed moving from the top boundary toward the bottom boundary, and displays a second space between a second line and the bottom boundary, based on the second brightness, while the second line is displayed moving from the bottom boundary toward the top boundary. Upon identifying that the first line and the second line are in contact, the processor displays the screen through the display based on the first brightness.
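The blink-guidance behavior summarized above can be sketched in Python. This is a minimal illustration under stated assumptions, not the patent's implementation: `camera.blink_detected()`, `display.fill()`, and every numeric value (timeout, brightness levels, sweep step) are hypothetical stand-ins for the device's eye-tracking and rendering APIs.

```python
import time

FIRST_BRIGHTNESS = 1.0    # normal screen brightness (assumed value)
SECOND_BRIGHTNESS = 0.3   # dimmed brightness for the first and second spaces (assumed value)


def run_blink_guidance(camera, display, screen_height=1080, step=40,
                       timeout_s=10.0):
    """Sweep two dimmed regions toward each other until they meet,
    but only if no blink is seen within `timeout_s`.

    `camera.blink_detected()` and `display.fill(top, bottom, brightness)`
    are hypothetical APIs standing in for the wearable device's
    eye-tracking camera and display pipeline.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if camera.blink_detected():
            return  # the eye blinked within the specified interval

    # No blink within the interval: a first line moves from the top
    # boundary toward the bottom, a second line from the bottom toward
    # the top, each trailing a space shown at the lower brightness.
    top_line, bottom_line = 0, screen_height
    while top_line < bottom_line:
        top_line += step
        bottom_line -= step
        display.fill(0, top_line, brightness=SECOND_BRIGHTNESS)                 # first space
        display.fill(bottom_line, screen_height, brightness=SECOND_BRIGHTNESS)  # second space
        display.fill(top_line, bottom_line, brightness=FIRST_BRIGHTNESS)        # space between the lines

    # The lines are in contact: restore the whole screen at the first brightness.
    display.fill(0, screen_height, brightness=FIRST_BRIGHTNESS)
```

In a real device the sweep would be driven by the render loop rather than a blocking function; the blocking form is used here only to keep the control flow of the claim visible in one place.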

Inventors

  • 정진교
  • 현은정
  • 전우람

Assignees

  • 삼성전자주식회사 (Samsung Electronics Co., Ltd.)

Dates

Publication Date
2026-05-08
Application Date
2024-11-18
Priority Date
2024-11-01

Claims (20)

  1. A wearable device comprising: at least one camera; a display; memory, comprising one or more storage media, storing instructions; and at least one processor comprising processing circuitry, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: identify, using the at least one camera, whether an eye of a user wearing the wearable device blinks within a specified time interval; based on identifying that the user's eye does not blink within the specified time interval: while a first line moving from a top boundary of a screen displayed through the display toward a bottom boundary of the screen is displayed, display a first space between the first line and the top boundary based on a second brightness lower than a first brightness; and while a second line moving from the bottom boundary of the screen toward the top boundary is displayed, display a second space between the second line and the bottom boundary based on the second brightness; and based on identifying that the first line and the second line are in contact, display the screen displayed through the display based on the first brightness.
  2. The wearable device of claim 1, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: while the first line and the second line are moving, display the screen displayed through the display based on a blur effect; and display a third space between the first line and the second line based on the first brightness.
  3. The wearable device of claim 1, further comprising at least one sensor, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to determine the first brightness based on a brightness outside the wearable device identified by the at least one sensor.
  4. The wearable device of claim 1, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: identify whether a current situation corresponds to a specified situation based on a type of an object displayed through the display, a movement of the eye identified using the at least one camera, a type of an application executed by the wearable device, and whether a user input is obtained; and based on identifying that the current situation corresponds to the specified situation, refrain from displaying the first space and the second space based on the second brightness lower than the first brightness.
  5. The wearable device of claim 1, wherein the first line and the second line are displayed during a first designated time interval, and wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to display a progress bar for the first designated time interval on at least a portion of the screen displayed through the display.
  6. The wearable device of claim 1, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: identify, using the at least one camera, whether an amount of change in a pupil size of the eye is less than a threshold value within a second designated time interval; based on identifying that the amount of change in the pupil size is less than the threshold value: reduce a size of content displayed on the screen; based on identifying that the amount of change in the pupil size corresponds to a specified value while the size of the content is reduced from a first size to a second size, display the content based on the second size during a third designated time interval; and after the third designated time interval, display the content based on the first size.
  7. The wearable device of claim 6, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: based on the content reduced to the second size, generate, using an artificial intelligence model, content for a space within the screen excluding the content; and display the generated content through the display.
  8. The wearable device of claim 6, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: identify whether a current situation corresponds to a specified situation based on whether at least one object within the screen displayed through the display moves, whether the user wearing the wearable device moves, and whether a user input is obtained; and based on identifying that the current situation corresponds to the specified situation, refrain from reducing the size of the content displayed on the screen.
  9. The wearable device of claim 1, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: identify whether a position of the user's gaze on the screen, identified using the at least one camera, remains within a threshold range during a fourth designated time interval; based on identifying that the position of the user's gaze on the screen remains within the threshold range during the fourth designated time interval, display a first point at a first position on the screen to guide the user's gaze; and based on identifying that the position of the user's gaze on the screen is located at the first point, display a second point at a second position on the screen to guide the user's gaze.
  10. The wearable device of claim 1, wherein the instructions, when executed individually or collectively by the at least one processor, cause the wearable device to: identify, using the at least one camera, whether the user's eye is bloodshot; and based on identifying that the user's eye is bloodshot, display the screen displayed through the display during a fifth designated time interval based on a brightness lower than the first brightness.
  11. A method performed by a wearable device comprising at least one sensor, at least one camera, and a display, the method comprising: identifying, using the at least one camera, whether an eye of a user wearing the wearable device blinks within a specified time interval; based on identifying that the user's eye does not blink within the specified time interval: while a first line moving from a top boundary of a screen displayed through the display toward a bottom boundary of the screen is displayed, displaying a first space between the first line and the top boundary based on a second brightness lower than a first brightness; and while a second line moving from the bottom boundary of the screen toward the top boundary is displayed, displaying a second space between the second line and the bottom boundary based on the second brightness; and based on identifying that the first line and the second line are in contact, displaying the screen displayed through the display based on the first brightness.
  12. The method of claim 11, further comprising: displaying the screen displayed through the display based on a blur effect while the first line and the second line are moving; and displaying a third space between the first line and the second line based on the first brightness.
  13. The method of claim 11, further comprising determining the first brightness based on a brightness outside the wearable device identified by the at least one sensor.
  14. The method of claim 11, further comprising: identifying whether a current situation corresponds to a specified situation based on a type of an object displayed through the display, a movement of the eye identified using the at least one camera, a type of an application executed by the wearable device, and whether a user input is obtained; and based on identifying that the current situation corresponds to the specified situation, refraining from displaying the first space and the second space based on the second brightness lower than the first brightness.
  15. The method of claim 11, wherein the first line and the second line are displayed during a first designated time interval, and wherein the method further comprises displaying a progress bar for the first designated time interval on at least a portion of the screen displayed through the display.
  16. The method of claim 11, further comprising: identifying, using the at least one camera, whether an amount of change in a pupil size of the eye is less than a threshold value within a second designated time interval; based on identifying that the amount of change in the pupil size is less than the threshold value: reducing a size of content displayed on the screen; displaying the content based on a second size during a third designated time interval, based on identifying that the amount of change in the pupil size corresponds to a specified value while the size of the content is reduced from a first size to the second size; and after the third designated time interval, displaying the content based on the first size.
  17. The method of claim 16, further comprising: based on the content reduced to the second size, generating, using an artificial intelligence model, content for a space within the screen excluding the content; and displaying the generated content through the display.
  18. The method of claim 16, further comprising: identifying whether a current situation corresponds to a specified situation based on whether at least one object within the screen displayed through the display moves, whether the user wearing the wearable device moves, and whether a user input is obtained; and based on identifying that the current situation corresponds to the specified situation, refraining from reducing the size of the content displayed on the screen.
  19. The method of claim 11, further comprising: identifying whether a position of the user's gaze on the screen, identified using the at least one camera, remains within a threshold range during a fourth designated time interval; displaying a first point at a first position on the screen to guide the user's gaze, based on identifying that the position of the user's gaze on the screen remains within the threshold range during the fourth designated time interval; and displaying a second point at a second position on the screen to guide the user's gaze, based on identifying that the position of the user's gaze on the screen is located at the first point.
  20. The method of claim 11, further comprising: identifying, using the at least one camera, whether the user's eye is bloodshot; and based on identifying that the user's eye is bloodshot, displaying the screen displayed through the display during a fifth designated time interval based on a brightness lower than the first brightness.
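The pupil-size mechanism of claims 6 and 16 can be sketched as a plan of content scale factors over time. This is a minimal sketch, not the patent's implementation: the threshold, shrink step, and hold duration are illustrative assumptions, and the claim's "specified value" check during the reduction is simplified to completing the reduction to the second size.

```python
def plan_content_sizes(pupil_delta, *, threshold=0.05,
                       first_size=1.0, second_size=0.6,
                       shrink_step=0.1, hold_steps=3):
    """Return the sequence of content scale factors to display.

    `pupil_delta` is the measured amount of change in pupil size over
    the "second designated time interval"; all numeric defaults are
    assumed values, not values from the patent.
    """
    if pupil_delta >= threshold:
        # Pupil size is already varying enough; leave the content as is.
        return [first_size]
    sizes = []
    size = first_size
    # Gradually reduce the content from the first size to the second size.
    while size > second_size:
        size = round(max(second_size, size - shrink_step), 2)
        sizes.append(size)
    # Hold at the second size during the "third designated time interval".
    sizes.extend([second_size] * hold_steps)
    # After that interval, display the content based on the first size again.
    sizes.append(first_size)
    return sizes
```

Returning a plan rather than driving the display directly keeps the sketch testable; a device would step through the returned scale factors at its frame rate, skipping the sequence entirely in the "specified situation" of claims 8 and 18 (moving objects, a moving user, or active user input).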

Description

Electronic device, method, and non-transitory computer-readable storage medium for controlling a screen displayed through a display

The following descriptions relate to an electronic device, a method, and a non-transitory computer-readable storage medium for controlling a screen displayed through a display.

A wearable device may be a head-mounted device (HMD) that can be worn on a user's head. The distance between the screen displayed on the wearable device's display and the eyes of the user wearing the wearable device may be relatively close. Because this distance is relatively close, use of the wearable device may cause damage to the user's eyes. The information described above may be provided as related art for the purpose of aiding understanding of the present disclosure. No claim or determination is made as to whether any of the foregoing is applicable as prior art with regard to the present disclosure.

In relation to the description of the drawings, the same or similar reference numerals may be used for identical or similar components. FIG. 1 is a block diagram of an electronic device in a network environment according to various embodiments. FIG. 2A illustrates an example of a perspective view of a wearable device. FIG. 2B illustrates an example of one or more hardware components placed within a wearable device. FIGS. 3A and 3B illustrate an example of the appearance of a wearable device. FIG. 4 illustrates an example of a block diagram of a wearable device. FIG. 5 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 6 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 7 illustrates examples of screens designed to induce a user to blink. FIG. 8 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 9 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 10A illustrates examples of screens designed to induce a change in the user's pupil size. FIG. 10B illustrates examples of screens designed to induce a change in the user's pupil size. FIG. 11 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 12 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 13 illustrates examples of screens designed to induce eye movements of a user. FIG. 14 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display. FIG. 15 illustrates examples of screens for reducing eye strain in users. FIG. 16 is a flowchart illustrating operations of a wearable device for controlling a screen displayed through a display.

The terms used in the present disclosure are used merely to describe specific embodiments and are not intended to limit the scope of the various embodiments. A singular expression may include a plural expression unless the context clearly indicates otherwise. Terms used herein, including technical or scientific terms, may have the same meanings as those generally understood by a person of ordinary skill in the art described in the present disclosure. Among the terms used in the present disclosure, terms defined in a general dictionary may be interpreted as having meanings that are the same as or similar to their contextual meanings in the related art, and are not to be interpreted in an ideal or excessively formal sense unless explicitly defined in the present disclosure. In some cases, even terms defined in the present disclosure should not be interpreted to exclude embodiments of the present disclosure.

In the various embodiments of the present disclosure described below, a hardware-based approach is described as an example. However, since the various embodiments of the present disclosure include techniques that use both hardware and software, they do not exclude a software-based approach.

Terms used in the following description to refer to images (e.g., image, frame, camera frame, captured image, camera image), terms referring to a user's hand (e.g., hand object, candidate object, hand candidate object, bounding box, candidate hand object), terms referring to signals (e.g., signaling, control signal, data, control data, request signal, information), terms referring to locations (e.g., location information, area information, object information, object location, object coordinates, reference object, coordinate information, location, coordinate, relative coordinate, absolute coordinate, coordinate system), terms referring to values (e.g., threshold value, reference value, reference area, reference range, level, threshold level, threshold, range, value, area), terms for operation states (e.g., step, operation, procedure), or terms ref