KR-102963764-B1 - MEDIA CONTENT PLAYER ON AN EYEWEAR DEVICE
Abstract
Systems and methods for performing operations are provided, the operations comprising: displaying a plurality of media content control options by one or more processors of an eyewear device; detecting, by a touch input interface of the eyewear device, a first touch input comprising a single finger touching the touch input interface; based on detecting the first touch input, causing a cursor to track the first touch input and navigate the plurality of media content control options to select a first media content control option; based on detecting that the single finger has not been released from touching the touch input interface, displaying a second media content control option related to the first media content control option; and performing a selection associated with the second media content control option based on movement of the single finger along the touch input interface.
Inventors
- Goodrich, Kyle
- McPhee, Andrew James
- Moreno, Daniel
Assignees
- Snap Inc.
Dates
- Publication Date: 2026-05-13
- Application Date: 2021-12-16
- Priority Date: 2021-08-24
Claims (20)
- A method comprising: simultaneously displaying, by one or more processors of an eyewear device, a video and a video player presenting a plurality of media content control options as an overlay on a real-world environment being viewed through lenses of the eyewear device, wherein one or more of the plurality of media content control options correspond to adjusting a playback position of the video presented by the video player; detecting, by a touch input interface of the eyewear device, a first touch input comprising a single finger touching the touch input interface; based on detecting the first touch input comprising a long-press operation, triggering display of a cursor and causing the cursor to track the first touch input to navigate the plurality of media content control options toward a first media content control option; determining that, after the single finger is initially detected by the touch input interface and while the single finger continues to be detected by the touch input interface, an additional finger has been added to touch the touch input interface together with the single finger, such that the touch input interface receives touch input by two fingers; and in response to determining that the additional finger has been added to touch the touch input interface together with the single finger after detecting the first touch input comprising the single finger, performing a first display operation comprising a selection associated with a second media content control option associated with the first media content control option, such that a first level of a set of menu options is navigated using one-finger navigation and a second level of the set of menu options is navigated using two-finger navigation.
- The method of claim 1, further comprising: displaying a second media content control option related to the first media content control option based on detecting that the single finger has not been released from touching the touch input interface; and performing a second display operation comprising a selection associated with the second media content control option based on movement of the single finger along the touch input interface.
- The method of claim 1, further comprising: determining, in response to detecting that the single finger continues to touch the touch input interface, that the first media content control option has been highlighted by the cursor for a threshold time period; and selecting the first media content control option in response to determining that the first media content control option has been highlighted by the cursor for the threshold time period.
- The method of claim 1, wherein the touch input interface is integrated into a frame of the eyewear device.
- The method of claim 1, wherein the first media content control option comprises a volume adjustment option, and wherein displaying the second media content control option comprises: displaying a slider for adjusting a volume of media content being played on the eyewear device; and adjusting the volume by detecting movement of the single finger along the touch input interface and changing a position of the slider.
- The method of claim 1, further comprising: removing the second media content control option from display in response to detecting release of the single finger from the touch input interface; and displaying the plurality of media content control options again in response to detecting the release of the single finger from the touch input interface.
- The method of claim 1, wherein the first media content control option comprises a seek option, and wherein displaying the second media content control option comprises: displaying a slider for adjusting a playback position of media content being played on the eyewear device; and adjusting the playback position by detecting movement of the single finger along the touch input interface and changing a position of the slider.
- The method of claim 1, wherein the first media content control option comprises a series selection option, and wherein displaying the second media content control option comprises: displaying a plurality of series identifiers; and highlighting a given series identifier among the plurality of series identifiers by detecting movement of the single finger along the touch input interface and moving the cursor.
- The method of claim 8, further comprising: displaying a plurality of episode identifiers in response to determining that the cursor has highlighted the given series identifier for a threshold time period; and moving the cursor based on detected movement of the single finger along the touch input interface to highlight a given episode identifier among the plurality of episode identifiers.
- The method of claim 9, further comprising playing an episode of a series corresponding to the given episode identifier in response to determining that the single finger has been released from the touch input interface.
- The method of claim 10, further comprising detecting movement of two fingers along the touch input interface and causing the cursor to move based on the detected movement of the two fingers along the touch input interface to highlight a series identifier among the plurality of series identifiers.
- The method of claim 1, further comprising: selecting between navigation of a first set of options and navigation of a second set of options based on whether the single finger or two fingers are dragged along the touch input interface; detecting continuous contact between the single finger and a frame of the eyewear device while different levels of a displayed playback control menu, comprising a volume control option and a series selection option, are being navigated; determining that the single finger has been dragged to highlight the volume control option; determining that the volume control option has been highlighted for a threshold time period; replacing display of the playback control menu with a volume slider in response to determining that the volume control option has been highlighted for the threshold time period; adjusting a volume corresponding to a volume control level of the playback control menu by dragging the single finger along the volume slider to a desired volume; after adjusting the volume using the single finger and the volume slider, detecting an additional finger on the frame of the eyewear device; and in response to detecting the additional finger on the frame of the eyewear device after adjusting the volume by dragging the single finger along the volume slider, displaying the playback control menu, comprising the volume control option and the series selection option, again at a position of the volume slider.
- The method of claim 1, wherein a playback control menu initially navigated using the single finger is subsequently navigated by dragging the two fingers along a frame of the eyewear device.
- An eyewear device comprising: a touch input interface; a storage device containing instructions; and at least one processor configured to execute the instructions to perform operations comprising: simultaneously displaying a video and a video player presenting a plurality of media content control options as an overlay on a real-world environment being viewed through lenses of the eyewear device, wherein one or more of the plurality of media content control options correspond to adjusting a playback position of the video presented by the video player; detecting, by the touch input interface of the eyewear device, a first touch input comprising a single finger touching the touch input interface; based on detecting the first touch input comprising a long-press operation, triggering display of a cursor and causing the cursor to track the first touch input to navigate the plurality of media content control options toward a first media content control option; determining that, after the single finger is initially detected by the touch input interface and while the single finger continues to be detected by the touch input interface, an additional finger has been added to touch the touch input interface together with the single finger, such that the touch input interface receives touch input by two fingers; and in response to determining that the additional finger has been added to touch the touch input interface together with the single finger after detecting the first touch input comprising the single finger, performing a first display operation comprising a selection associated with a second media content control option associated with the first media content control option, such that a first level of a set of menu options is navigated using one-finger navigation and a second level of the set of menu options is navigated using two-finger navigation.
- The eyewear device of claim 14, the operations further comprising: displaying a second media content control option related to the first media content control option based on detecting that the single finger has not been released from touching the touch input interface; and performing a second display operation comprising a selection associated with the second media content control option based on movement of the single finger along the touch input interface.
- The eyewear device of claim 14, the operations further comprising: determining, in response to detecting that the single finger continues to touch the touch input interface, that the first media content control option has been highlighted by the cursor for a threshold time period; and selecting the first media content control option in response to determining that the first media content control option has been highlighted by the cursor for the threshold time period.
- The eyewear device of claim 14, wherein the first media content control option comprises a seek option, and wherein displaying the second media content control option comprises: displaying a slider for adjusting a playback position of media content being played on the eyewear device; and adjusting the playback position by detecting movement of the single finger along the touch input interface and changing a position of the slider.
- A non-transitory machine-readable storage medium comprising instructions that, when executed by one or more processors of an eyewear device, cause the eyewear device to perform operations comprising: simultaneously displaying a video and a video player presenting a plurality of media content control options as an overlay on a real-world environment being viewed through lenses of the eyewear device, wherein one or more of the plurality of media content control options correspond to adjusting a playback position of the video presented by the video player; detecting, by a touch input interface of the eyewear device, a first touch input comprising a single finger touching the touch input interface; based on detecting the first touch input comprising a long-press operation, triggering display of a cursor and causing the cursor to track the first touch input to navigate the plurality of media content control options toward a first media content control option; determining that, after the single finger is initially detected by the touch input interface and while the single finger continues to be detected by the touch input interface, an additional finger has been added to touch the touch input interface together with the single finger, such that the touch input interface receives touch input by two fingers; and in response to determining that the additional finger has been added to touch the touch input interface together with the single finger after detecting the first touch input comprising the single finger, performing a first display operation comprising a selection associated with a second media content control option associated with the first media content control option, such that a first level of a set of menu options is navigated using one-finger navigation and a second level of the set of menu options is navigated using two-finger navigation.
- The non-transitory machine-readable storage medium of claim 18, the operations further comprising: displaying a second media content control option related to the first media content control option based on detecting that the single finger has not been released from touching the touch input interface; and performing a second display operation comprising a selection associated with the second media content control option based on movement of the single finger along the touch input interface.
- The non-transitory machine-readable storage medium of claim 18, the operations further comprising: determining, in response to detecting that the single finger continues to touch the touch input interface, that the first media content control option has been highlighted by the cursor for a threshold time period; and selecting the first media content control option in response to determining that the first media content control option has been highlighted by the cursor for the threshold time period.
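The claimed interaction model is a small state machine: one finger drags a cursor through a first menu level, a dwell past a threshold selects the highlighted option, and adding a second finger switches navigation to a second menu level. The following is a minimal illustrative sketch of that model; the class, option names, and dwell threshold are assumptions for illustration and do not appear in the patent.

```python
from dataclasses import dataclass
from typing import Optional

DWELL_THRESHOLD_S = 1.0  # assumed dwell time for a highlight to count as a selection


@dataclass
class MenuNavigator:
    """Hypothetical two-level menu navigator driven by touch samples."""
    level1: list              # first-level options, e.g. ["volume", "seek", "series"]
    level2: dict              # maps a first-level option to its second-level options
    cursor: int = 0           # index of the currently highlighted option
    active_level: int = 1     # 1 = one-finger navigation, 2 = two-finger navigation
    selected: Optional[str] = None
    dwell: float = 0.0        # seconds the cursor has rested on one option

    def on_touch(self, finger_count: int, dx: int, dt: float) -> None:
        """Process one touch sample: dx is cursor movement in option slots,
        dt is elapsed seconds since the previous sample."""
        if finger_count == 2 and self.active_level == 1 and self.selected:
            # Additional finger detected after a selection: descend to level 2.
            self.active_level = 2
            self.cursor, self.dwell = 0, 0.0
            return
        options = self.level1 if self.active_level == 1 else self.level2[self.selected]
        if dx:
            # Finger movement along the frame moves the cursor and resets the dwell.
            self.cursor = max(0, min(len(options) - 1, self.cursor + dx))
            self.dwell = 0.0
        else:
            # Finger held still: accumulate dwell; past the threshold, select.
            self.dwell += dt
            if self.dwell >= DWELL_THRESHOLD_S and self.active_level == 1:
                self.selected = options[self.cursor]
```

A session mirroring claims 1 and 3 would be: drag one finger to highlight an option, hold until the dwell threshold selects it, then add a second finger to navigate the next level.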
Description
Media Content Player on an Eyewear Device

This application claims the benefit of priority to U.S. Provisional Application No. 63/129,344, filed December 22, 2020, and U.S. Patent Application No. 17/410,814, filed August 24, 2021, each of which is incorporated herein by reference in its entirety.

This application relates to eyewear devices. Some electronics-enabled eyewear devices, such as so-called smart glasses, allow users to interact with virtual content while engaging in certain activities. Users wear these eyewear devices and can view the real-world environment through them while interacting with virtual content displayed by the devices.

The figures in the accompanying drawings represent exemplary embodiments of the present disclosure and should not be construed as limiting its scope. FIG. 1 is a schematic representation of a networked environment in which the present disclosure may be deployed, according to some examples. FIG. 2 is a schematic representation of a messaging system having both client-side and server-side functionality, according to some examples. FIG. 3 is a schematic representation of a data structure as maintained in a database, according to some examples. FIG. 4 is a schematic representation of a message, according to some examples. FIG. 5 is a perspective view of an eyewear device, according to an exemplary embodiment. FIG. 6 is a flowchart illustrating exemplary operations of a media content control interface system, according to an exemplary embodiment. FIGS. 7 through 9 are exemplary screens of a graphical user interface for a media content control interface system, according to an exemplary embodiment. FIG. 10 is a schematic representation of a machine in the form of a computer system within which a set of instructions can be executed to cause the machine to perform any one or more of the methodologies discussed herein, according to some examples. FIG. 11 is a block diagram illustrating a software architecture within which examples can be implemented.

The following description discusses exemplary embodiments of the present disclosure. In the following description, for purposes of explanation, numerous specific details are set forth to provide an understanding of various embodiments of the disclosed subject matter. It will be evident, however, to those skilled in the art that embodiments of the disclosed subject matter can be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.

Typical smart glasses platforms allow users to interact with various types of virtual content. These platforms are configured to display virtual content on the lenses of the smart glasses. Interactions with this virtual content are generally limited to single-button selections or voice navigation because of the limited set of user input interfaces available on smart glasses. Specifically, such smart glasses may include only a single touch input interface. While these systems generally work well for interacting with virtual content, they prevent users from navigating complex menu structures, which limits the amount of content a user can navigate and the types of interactions a user can perform.

The disclosed embodiments improve the efficiency of using an electronic device by providing a system that utilizes multiple types of user inputs to enable smooth and rapid navigation through a complex menu hierarchy to control content playback on an eyewear device. Specifically, according to the disclosed techniques, a plurality of media content control options are displayed by one or more processors of the eyewear device. The disclosed embodiments detect, by a touch input interface of the eyewear device, a first touch input comprising a single finger touching the touch input interface.
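One concrete single-finger interaction in this disclosure is the slider control (volume or playback position), where finger movement along the frame's touch interface changes a slider value. A minimal sketch of that mapping follows; the touchpad coordinate span, value range, and function name are assumptions for illustration, not details from the patent.

```python
def drag_to_value(touch_x: float, pad_min: float, pad_max: float,
                  lo: float = 0.0, hi: float = 1.0) -> float:
    """Map a finger position along the touch interface to a clamped slider value.

    Assumes pad_max > pad_min. Positions beyond either end of the touch
    interface clamp to the ends of the slider's value range [lo, hi].
    """
    t = (touch_x - pad_min) / (pad_max - pad_min)   # normalize to [0, 1]
    t = max(0.0, min(1.0, t))                       # clamp overshoot at the frame's edges
    return lo + (hi - lo) * t
```

For example, with a touch span of 0 to 100 units along the temple, a finger at 50 maps to the midpoint of the slider's range, and dragging past either end simply pins the value at that end.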
The disclosed embodiments cause a cursor to navigate the plurality of media content control options based on the first touch input to select a first media content control option among the plurality of media content control options, and display a second media content control option associated with the first media content control option in response to the first touch input. The disclosed embodiments perform a selection associated with the second media content control option based on detecting movement of the single finger along the touch input interface. The disclosed embodiments increase the efficiency of electronic devices by reducing the number of screens of information and inputs needed to accomplish a task, and further increase the efficiency, appeal, and usefulness of electronic eyewear devices.

Networked computing environment

FIG. 1 is a block diagram illustrating an exemplary messaging system (100) for exchanging data (e.g., messages and associated content) over a network. The messaging system (100) includes mul