US-12625561-B2 - Wearable bands and electromyographic gesture detection for command and control of wearable computing devices

US12625561B2

Abstract

An electromyographic gesture detection system for wearable devices, useful for entering text, selecting, controlling and manipulating virtual objects and smart applications on a display using gestures of the user. A wearable band is adapted to allow electromyographic detection and classification of specific gestures. The wearable device in some embodiments may also be adapted to recognize certain characteristics of the gestures, such as duration or amplitude, to allow enhanced navigation of a computer interface.

Inventors

  • Maximilian Ralph Peter von und zu Liechtenstein

Assignees

  • Maximilian Ralph Peter von und zu Liechtenstein

Dates

Publication Date
2026-05-12
Application Date
2024-10-28

Claims (5)

  1. A wearable electromyographic input system comprising: a wearable assembly configured to be worn on a user; a plurality of needle-less surface electrodes including at least one reference electrode and a plurality of signal electrodes, the wearable assembly configured to provide dry, non-adhesive touch-based contact between the electrodes and the user's skin when the wearable assembly is worn, the electrodes configured to measure myoelectric voltage differentials; and a processor operatively coupled to the electrodes; wherein the processor is configured to: (i) determine, from the myoelectric voltage differentials, an amplitude value indicative of strength of muscle activation; (ii) compute a control-effect magnitude by applying a nonlinear mapping function to the amplitude value; and (iii) determine, based on the myoelectric voltage differentials, a direction of a control effect selecting between a first direction and an opposite second direction of a control axis, whereby a virtual object is manipulated in accordance with the control-effect magnitude and the selected direction.
  2. The system of claim 1, wherein the control-effect magnitude controls a rate of change of a parameter of the virtual object rather than an absolute value of the parameter.
  3. The wearable electromyographic input system of claim 1, wherein the control effect comprises applying a simulated physical force or torque to the virtual object.
  4. The wearable electromyographic input system of claim 1, wherein the control effect is a vector quantity comprising the control-effect magnitude and the selected direction.
  5. The system of claim 1, wherein the nonlinear mapping function comprises an exponential function of the amplitude value.
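Steps (i)-(iii) of claim 1, together with the exponential mapping of claim 5, can be illustrated with a minimal sketch. The patent does not specify an amplitude estimator, gain values, or how direction is derived from the differentials; the RMS amplitude, the `gain` and `exponent_scale` parameters, and the mean-sign direction rule below are all illustrative assumptions.

```python
import math

def control_effect(voltage_samples, gain=1.0, exponent_scale=4.0):
    """Hypothetical sketch of the claimed pipeline:
    myoelectric voltage differentials -> signed control effect.
    Parameter names and values are illustrative, not from the patent."""
    # (i) amplitude value indicative of muscle-activation strength
    # (RMS is one common choice; the claim does not mandate it).
    amplitude = math.sqrt(
        sum(v * v for v in voltage_samples) / len(voltage_samples)
    )
    # (ii) nonlinear mapping of amplitude to a control-effect magnitude;
    # claim 5 names an exponential function, sketched here as exp(k*a) - 1
    # so that zero activation yields zero effect.
    magnitude = gain * (math.exp(exponent_scale * amplitude) - 1.0)
    # (iii) direction along the control axis, chosen here from the mean
    # sign of the differentials (one plausible reading of the claim).
    mean = sum(voltage_samples) / len(voltage_samples)
    direction = 1 if mean >= 0 else -1
    return direction * magnitude
```

Per claim 2, the returned value could drive a rate of change (e.g., a rotation velocity) of a virtual-object parameter rather than setting the parameter directly, and per claim 4 the magnitude-direction pair forms a vector quantity along the control axis.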

Description

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of application Ser. No. 18/772,132, filed Jul. 13, 2024, which is a continuation of application Ser. No. 18/096,820, filed Jan. 13, 2023, which is a continuation of application Ser. No. 17/723,513, filed Apr. 19, 2022, which is a continuation of application Ser. No. 16/181,475, filed Nov. 6, 2018, which is a continuation-in-part of application Ser. No. 14/993,134, filed Jan. 12, 2016, which claims the benefit of U.S. Provisional Application No. 62/102,235, filed Jan. 12, 2015.

STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT

Not applicable to this application.

BACKGROUND

Field

Example embodiments in general relate to a wink gesture based control system for entering text, selecting, controlling and manipulating virtual objects and smart applications on a head-mounted display using wink gestures and/or facial expressions of the user.

Related Art

Any discussion of the related art throughout the specification should in no way be considered as an admission that such related art is widely known or forms part of common general knowledge in the field.

Digital devices are prevalent throughout the world. For example, mobile smart phones, smart glasses, smart watches, and the like are becoming more and more popular as their functionalities improve. In some areas of the world, it is rare for an individual to leave the house without relying in some way on such a digital device. This is particularly true in a wide range of industries, where digital automation has resulted in numerous body-worn devices that may control other units from a remote location. While such digital devices provide a range of functionalities, they typically suffer from the shortcoming that they require hand input (by touch or gesture) or speech input for operation.
For example, smart phones typically include a touchscreen that is used to enter nearly all input instructions and to navigate the user interface. This can be particularly burdensome for those without full use of their hands, whether due to medical issues such as disabilities or due to the nature of the work being performed. For example, a surgeon will often desire to use such a digital device but would be limited while performing a task that requires both hands, such as surgery. Speech input has recently also gained popularity in controlling digital devices; however, there is a host of real-world applications where both speech input and hand-gesture input are impossible or undesirable because the surrounding environment is either too noisy or requires secrecy, which by elimination leaves input by wink gestures as the only practical mode for controlling the user interface of such devices.

SUMMARY

This disclosure provides an apparatus in the form of a wearable computing device capable of electromyographic gesture detection and gesture-based input to a user interface. The device comprises a wearable band adapted for detection of gestures by electromyographic means, using sensors that are in direct but non-adhesive skin contact when the band is worn. The apparatus is capable of parametrizing the detected gestures, in particular in terms of an amplitude and a duration. The device is further capable of interpreting the gestures by translating each gesture into a corresponding input on the user interface of a wearable computing device.

There has thus been outlined, rather broadly, some of the features of the gesture based control technique in order that the detailed description thereof may be better understood, and in order that the present contribution to the art may be better appreciated.
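The summary's parametrization of a detected gesture by amplitude and duration can be sketched as follows. The patent does not disclose a detection algorithm; the threshold-based onset/offset rule, the rectified-envelope input, and all names below are illustrative assumptions.

```python
def parametrize_gesture(envelope, dt, threshold=0.05):
    """Hypothetical sketch: extract duration and peak amplitude of one
    gesture from a rectified EMG envelope sampled every `dt` seconds.
    The threshold and representation are illustrative assumptions."""
    # Samples at or above the threshold are treated as gesture activity.
    active = [(i, v) for i, v in enumerate(envelope) if v >= threshold]
    if not active:
        return None  # no gesture detected in this window
    first, last = active[0][0], active[-1][0]
    # Duration spans the first through last supra-threshold sample.
    duration = (last - first + 1) * dt
    # Amplitude is taken as the peak of the envelope during the gesture.
    peak = max(v for _, v in active)
    return {"duration_s": duration, "amplitude": peak}
```

These two parameters could then drive distinct interface behaviors, e.g., a short low-amplitude gesture mapped to a selection and a long high-amplitude gesture mapped to a continuous manipulation of a virtual object.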
There are additional features of the gesture based control technique that will be described hereinafter and that will form the subject matter of the claims appended hereto. In this respect, before explaining at least one embodiment of the gesture based control technique in detail, it is to be understood that the gesture based control technique is not limited in its application to the details of construction or to the arrangements of the components set forth in the following description or illustrated in the drawings. The gesture based control technique is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.

BRIEF DESCRIPTION OF THE DRAWINGS

Example embodiments will become more fully understood from the detailed description given herein below and the accompanying drawings, wherein like elements are represented by like reference characters, which are given by way of illustration only and thus are not limitative of the example embodiments herein.

FIG. 1A is a first perspective view of a wink gesture based control system in accordance with an example embodiment.

FIG. 1B is a second pers