EP-4097572-B1 - A SYSTEM FOR BIOPOTENTIAL-BASED GESTURE CONTROL
Inventors
- Cipoletta, David O.
- Ang, Dexter W.
Dates
- Publication Date: 2026-05-06
- Application Date: 2021-01-22
Claims (11)
- A system for biopotential-based gesture control, the system comprising: a biopotential sensor (106), the biopotential sensor configured to be worn by a person at a wrist and to sense biopotentials at the wrist; and a wrist location sensor (108), the wrist location sensor configured to be worn at the wrist and to provide wrist location data indicative of a geographic location of the wrist and a pointing direction of the wrist; wherein the system is configured to: based on at least the wrist location data from the wrist location sensor, determine a pointing vector (1002); based on the pointing vector and a three dimensional surface, select an object (1004); based on at least the biopotentials sensed by the biopotential sensor, detect a gesture indicating an intention to perform an operation; and based on the selected object (1004) and the detected gesture, perform the operation with respect to the selected object (1004), the selected object being indicated by an intersection of the pointing vector and the three dimensional surface.
- The system of claim 1, wherein the system is further configured to: after performing the operation with respect to the object, control the object; wherein the step of controlling the object comprises: based on at least the wrist location data from the wrist location sensor and the biopotentials sensed by the biopotential sensor, determining one or more commands; transmitting the one or more commands to the object.
- The system of claim 2, wherein the system is further configured to: initiate a mode in which the step of controlling the object may be performed; wherein initiating the mode in which the step of controlling the object may be performed comprises: based on one or more of (i) the wrist location data from the wrist location sensor and (ii) the biopotentials sensed by the biopotential sensor, detecting a wake word gesture, the wake word gesture indicating an intention to initiate the mode in which the step of controlling the object may be performed.
- The system of any of claims 1-3, wherein the step of determining the pointing vector intended by the person is performed based on both: the wrist location data from the wrist location sensor; and data indicating a location of the person.
- The system of claim 4, wherein the data indicating the location of the person is global positioning system (GPS) data.
- The system of claim 4, wherein the data indicating the location of the person is based on triangulation using directions from multiple reference points.
- The system of any of claims 4-6, wherein the step of selecting the object is based on at least: the pointing vector based on (i) the wrist location data from the wrist location sensor and (ii) the data indicating a location of the person; the detected gesture based on at least the biopotentials sensed by the biopotential sensor; and context data comprising information related to the object.
- The system of claim 7, wherein the context data comprises a location of the object.
- The system of claim 7, wherein the context data comprises image data or video data.
- The system of any of claims 1-9, wherein the system comprises: a user interface device (100) that is configured to be worn at the wrist of the person, the biopotential sensor (106) and the wrist location sensor (108) being included in the user interface device (100); and a responsive device (200), the responsive device (200) comprising a processor that is configured to perform the steps of: (i) determining the pointing vector (1002) intended by the person, (ii) detecting the gesture indicating the intention to perform the operation; and (iii) performing the operation with respect to the object (1004).
- The system of any of claims 1-9, wherein the system comprises a user interface device (100) that is configured to be worn at the wrist of the person, the biopotential sensor (106) and the wrist location sensor (108) being included in the user interface device, and the user interface device (100) comprises a processor that is configured to perform the steps of: (i) determining the pointing vector (1002) intended by the person, (ii) detecting the gesture indicating the intention to perform the operation; and (iii) performing the operation with respect to the object (1004).
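The claims do not specify an algorithm for the selection step. A minimal sketch of one plausible reading — a pointing vector derived from the wrist's azimuth and elevation, intersected with a planar surface standing in for the three dimensional surface, and the nearest registered object selected — might look as follows. All names, the local east/north/up frame, and the plane simplification are assumptions for illustration, not the patented method:

```python
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def sub(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
    def dot(self, o): return self.x * o.x + self.y * o.y + self.z * o.z
    def norm(self): return math.sqrt(self.dot(self))

def pointing_vector(azimuth_deg, elevation_deg):
    """Unit pointing direction from wrist azimuth/elevation (x east, y north, z up)."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return Vec3(math.cos(el) * math.sin(az), math.cos(el) * math.cos(az), math.sin(el))

def ray_surface_hit(origin, direction, plane_point, plane_normal):
    """Intersect the pointing ray with a planar surface; None if parallel or behind."""
    denom = direction.dot(plane_normal)
    if abs(denom) < 1e-9:
        return None
    t = plane_point.sub(origin).dot(plane_normal) / denom
    if t < 0:
        return None
    return Vec3(origin.x + t * direction.x,
                origin.y + t * direction.y,
                origin.z + t * direction.z)

def select_object(hit, objects, tolerance_m=1.0):
    """Select the registered object nearest the intersection point, within tolerance."""
    if hit is None or not objects:
        return None
    name, pos = min(objects, key=lambda o: hit.sub(o[1]).norm())
    return name if hit.sub(pos).norm() <= tolerance_m else None

# Hypothetical scene: wrist at the origin, pointing due east and level; a wall at x = 5 m.
hit = ray_surface_hit(Vec3(0, 0, 0), pointing_vector(90, 0),
                      Vec3(5, 0, 0), Vec3(1, 0, 0))
picked = select_object(hit, [("lamp", Vec3(5, 0.2, 0)), ("tv", Vec3(5, 3, 0))])
```

In this toy scene the ray meets the wall at roughly (5, 0, 0), so the "lamp" 0.2 m away is selected rather than the "tv" 3 m away; a real implementation would intersect against an arbitrary surface model and fold in the context data of claims 7-9.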
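Claim 6 locates the person "based on triangulation using directions from multiple reference points" without giving a procedure. A common planar version — intersecting bearing rays from two known reference points — can be sketched as below; the two-ray, east/north-plane setup and all names are assumptions for illustration only:

```python
import math

def bearing_dir(bearing_deg):
    """Unit direction for a compass bearing in a local plane (x east, y north)."""
    b = math.radians(bearing_deg)
    return (math.sin(b), math.cos(b))

def triangulate(p1, bearing1_deg, p2, bearing2_deg):
    """Estimate a position from two reference points and the bearings observed
    from each toward the target; returns None if the rays are parallel."""
    (x1, y1), (dx1, dy1) = p1, bearing_dir(bearing1_deg)
    (x2, y2), (dx2, dy2) = p2, bearing_dir(bearing2_deg)
    # Solve p1 + t1*d1 = p2 + t2*d2 by Cramer's rule.
    det = dx2 * dy1 - dx1 * dy2
    if abs(det) < 1e-9:
        return None
    t1 = (-(x2 - x1) * dy2 + dx2 * (y2 - y1)) / det
    return (x1 + t1 * dx1, y1 + t1 * dy1)

# Hypothetical fix: one reference sights the person due east, another due south.
pos = triangulate((0.0, 0.0), 90.0, (10.0, 10.0), 180.0)
```

Here the east-going ray from (0, 0) and the south-going ray from (10, 10) cross at approximately (10, 0); with more than two reference points a least-squares fit would typically replace the exact intersection.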
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Patent Application No. 16/890,507, filed June 2, 2020, published as US20210232226A1, which is a continuation-in-part of U.S. Patent Application serial number 16/774,825, filed January 28, 2020, published as US20210232224A1. This application is related to U.S. Patent Application serial number 16/104,273, filed August 17, 2018, pending, which has been published as US20190082996A1 and which is a continuation of U.S. Patent Application serial number 15/826,131, now U.S. Patent 10,070,799 issued September 11, 2018, which is a nonprovisional application of U.S. provisional patent application 62/566,674, filed October 7, 2017, and U.S. provisional patent application 62/429,334, filed December 2, 2016. This application also is related to U.S. Patent application serial number 16/055,123, filed August 5, 2018, published as US20200042087A1. This application also is related to U.S. Patent application serial number 16/246,964, filed January 14, 2019, published as US20200042089A1. This application also is related to PCT Patent application serial number PCT/US19/061421, filed November 4, 2019, pending, which is an international application designating the United States claiming priority to U.S. Patent Application serial number 16/196,462, filed November 20, 2018, published as US20200159321A1. This application also is related to U.S. Patent application serial number 16/737,252, filed January 8, 2020, published as US20210210114A1.

BACKGROUND

Most machines have a form of "user interface" through which a person interacts with the machine. The person provides inputs through one or more devices from which the machine interprets the person's intent. The machine provides feedback to the person in response to those inputs, such as by the behavior of the machine or by outputs through one or more devices which present information to the person.
When the machine is or includes a computer system with a display, a common paradigm for the user interface is a "graphical user interface". With a graphical user interface, the person manipulates a user interface device which provides input to an application running on a computer. In turn, the computer provides visual output on a display. The computer may provide other outputs, such as audio. Through the user interface device, the person can control a position of a graphical object, typically called a cursor, within a display, and can indicate an action to be performed. The action to be performed typically is based in part on the position of that graphical object within the information presented on the display. A variety of user interface devices have been created for computers. Likewise, many kinds of graphical user interfaces and other kinds of user interface paradigms have been used with computers.

US2017090555 discloses a device comprising: a biopotential sensor, the biopotential sensor configured to be worn by a person at a wrist and to sense biopotentials at the wrist; and a wrist location sensor, the wrist location sensor configured to be worn at the wrist and to provide wrist location data indicative of an orientation of the wrist; wherein the device is configured to: based on the orientation and the sensed biopotentials, detect a gesture indicating an intention to select or move an object. US2016/313801 discloses a gesture controlled interface, which may use a number of different signals including surface nerve conduction signals to determine gestures. The gestures may be used to control another device such as a radio in a car. WO2013/049248 discloses a wrist worn device that is responsive to user gestures enabling a user to navigate displayed images projected to an eyepiece. US2019/155375 discloses a device for displaying an obstructed target by receiving information from another device having a view of the target.
US2018/317770 discloses a wearable device that can recognise a gesture allowing for interaction with augmented and virtual reality systems.

SUMMARY

This Summary introduces a selection of concepts in simplified form that are described further below in the Detailed Description. This Summary neither identifies features as key or essential, nor limits the scope, of the claimed subject matter. Various respective aspects are set out in the appended claims, which define the scope of protection of the present invention.

The graphical user interface paradigm for interacting with a machine or computer can be difficult to use in some environments. In some environments, a computing device may have a small display or no display at all. Further, in some environments, users may not be able to easily manipulate any input devices, because they are grasping or holding another object. For a variety of reasons, in some environments, the typical combination of an input device and a display does not provide a sufficiently intuitive, workable, convenient, or safe form of interaction with a machine. Further, most graph