CN-115346238-B - Gesture recognition method and device and computer equipment

CN 115346238 B

Abstract

The disclosure relates to a gesture recognition method, a gesture recognition device, and computer equipment. When gesture recognition is triggered, the method acquires the spatial coordinates of each joint in the hand to be recognized; calculates, according to a pre-established projection coordinate system and the spatial coordinates, a first included angle and a projection distance between the fingertip joint and the palm joint of each finger in the hand to be recognized; and determines the bending degree of each finger according to the first included angle and the projection distance, wherein the projection coordinate system is established according to the palm joint and the wrist joint of the hand to be recognized. A gesture recognition result of the hand to be recognized is then determined according to the bending degree of each finger. With this method, a variety of complex gestures can be recognized without a large amount of computation.

Inventors

  • Zhao Zhiliang
  • Cao Genyin
  • Jun Can
  • Wang Hongfei
  • Jiang Mingwu

Assignees

  • 苏州光格科技股份有限公司

Dates

Publication Date
2026-05-08
Application Date
2022-07-26

Claims (10)

  1. A gesture recognition method, the method comprising: when gesture recognition is triggered, acquiring the spatial coordinates of each joint in a hand to be recognized; calculating, according to a pre-established projection coordinate system and the spatial coordinates, a first included angle and a projection distance between the fingertip joint and the palm joint of each finger in the hand to be recognized, and determining the bending degree of each finger according to the first included angle and the projection distance, wherein the projection coordinate system is established according to the palm joint and the wrist joint of the hand to be recognized and is used for determining the relative positional relationship between the fingertip joint and the palm joint of each finger; and determining a gesture recognition result of the hand to be recognized according to the bending degree of each finger; wherein establishing the projection coordinate system comprises: determining a coordinate system origin according to the palm joint of the hand to be recognized, determining a first coordinate axis according to a first direction from the wrist joint to the palm joint, determining a second coordinate axis according to a straight line perpendicular to the first coordinate axis, and establishing the projection coordinate system according to the coordinate system origin, the first coordinate axis and the second coordinate axis.
  2. The method of claim 1, wherein the projection distance comprises a first projection distance and a second projection distance, and calculating the first included angle and the projection distance between the fingertip joint and the palm joint of each finger in the hand to be recognized comprises: obtaining the first projection distance, the second projection distance and the first included angle according to the spatial coordinates of the fingertip joint and the palm joint of each finger; wherein the first projection distance is the distance between the fingertip joint of each finger and the first coordinate axis of the projection coordinate system; the second projection distance is the distance between the fingertip joint of each finger and the second coordinate axis of the projection coordinate system; and the first included angle is the angle between a first vector and the second coordinate axis, the first vector being formed by the fingertip joint and the palm joint of each finger.
  3. The method of claim 1, wherein after acquiring the spatial coordinates of each joint in the hand to be recognized, the method further comprises: determining, according to the spatial coordinates of the joints, the flexion-extension state of each finger and/or the open-close state between the fingers of the hand to be recognized; and determining the gesture recognition result of the hand to be recognized according to the flexion-extension states and/or the open-close states, or according to the flexion-extension states and/or the open-close states together with the bending degrees; wherein the flexion-extension states comprise a stretched state, a curled state, an intermediate state between stretched and curled, and an unrecognizable state, and the open-close states comprise a closed state, a non-closed state and an unrecognizable state.
  4. The method of claim 3, wherein determining the flexion-extension state of each finger in the hand to be recognized comprises: determining joint vectors according to the spatial coordinates of adjacent joints in each finger; calculating the sum of the dot products between the joint vectors of each finger; and determining the flexion-extension state of the finger according to the sum of the dot products and a preset range interval.
  5. The method of claim 3, wherein determining the open-close state between the fingers comprises: determining a recognition vector for each finger according to the spatial coordinates of the wrist joint of the hand to be recognized and the spatial coordinates of the middle joint of each finger; calculating a second included angle between the recognition vectors of adjacent fingers; and determining the open-close state of the fingers according to the second included angle and a preset angle threshold.
  6. The method of claim 1, wherein before acquiring the spatial coordinates of each joint in the hand to be recognized, the method further comprises: performing parameter calibration on the hand to be recognized according to a preset gesture calibration action, and determining extremum states of the hand to be recognized, wherein the extremum states comprise at least the maximum and minimum extension lengths of the hand, the maximum and minimum closing angles of each finger, and the maximum and minimum bending degrees of each finger.
  7. The method of claim 1, wherein triggering gesture recognition comprises: determining a first position vector according to the spatial coordinates of the eye and the spatial coordinates of the wrist joint of the hand to be recognized; determining a second position vector according to the spatial coordinates of the wrist joint and the spatial coordinates of the palm joint; calculating a third included angle between the first position vector and the second position vector; and triggering gesture recognition when the third included angle is within a preset angle range; and after calculating the third included angle between the first position vector and the second position vector, the method further comprises: exiting gesture recognition if the third included angle is not within the preset angle range, or if the hand to be recognized is not within a preset hand recognition area.
  8. The method of claim 1, wherein the method is applied to interaction with AR devices.
  9. A gesture recognition apparatus, comprising: a coordinate acquisition module configured to acquire the spatial coordinates of each joint in a hand to be recognized when gesture recognition is triggered; a bending degree determination module configured to calculate, according to a pre-established projection coordinate system and the spatial coordinates, a first included angle and a projection distance between the fingertip joint and the palm joint of each finger in the hand to be recognized, and to determine the bending degree of each finger according to the first included angle and the projection distance, wherein the projection coordinate system is established according to the palm joint and the wrist joint of the hand to be recognized and is used for determining the relative positional relationship between the fingertip joint and the palm joint of each finger, and establishing the projection coordinate system comprises: determining a coordinate system origin according to the palm joint of the hand to be recognized, determining a first coordinate axis according to a first direction from the wrist joint to the palm joint, determining a second coordinate axis according to a straight line perpendicular to the first coordinate axis, and establishing the projection coordinate system according to the coordinate system origin, the first coordinate axis and the second coordinate axis; and a gesture recognition module configured to determine a gesture recognition result of the hand to be recognized according to the bending degree of each finger.
  10. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 8 when executing the computer program.
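The finger-state classifications of claims 4 and 5 can be sketched as follows. This is an illustrative sketch only: the five-joint finger layout, the dot-product-sum thresholds, and the 12-degree closing angle are assumptions for demonstration, since the patent leaves the preset range interval and angle threshold unspecified.

```python
import numpy as np

# Assumed thresholds for a five-joint finger model (base to tip);
# the patent's "preset range interval" is not disclosed.
STRETCHED_MIN = 2.5
CURLED_MAX = 1.0

def flexion_state(finger_joints):
    """Classify a finger as stretched/curled/intermediate from the sum
    of dot products between consecutive joint vectors (claim 4)."""
    joints = np.asarray(finger_joints, dtype=float)
    vectors = np.diff(joints, axis=0)  # joint vectors between adjacent joints
    vectors /= np.linalg.norm(vectors, axis=1, keepdims=True)
    total = sum(float(np.dot(vectors[i], vectors[i + 1]))
                for i in range(len(vectors) - 1))
    if total >= STRETCHED_MIN:   # nearly collinear joint vectors
        return "stretched"
    if total <= CURLED_MAX:      # joint vectors turning sharply
        return "curled"
    return "intermediate"

def open_close_state(wrist, middle_a, middle_b, angle_threshold_deg=12.0):
    """Classify two adjacent fingers via the second included angle between
    their wrist-to-middle-joint recognition vectors (claim 5)."""
    va = np.asarray(middle_a, float) - np.asarray(wrist, float)
    vb = np.asarray(middle_b, float) - np.asarray(wrist, float)
    cos_angle = np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return "closed" if angle < angle_threshold_deg else "open"
```

A perfectly straight finger yields collinear joint vectors whose pairwise dot products each equal 1, so the sum approaches the number of dot products; curling drives the sum toward zero or below, which is why a single scalar with range intervals suffices.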

Description

Gesture recognition method and device and computer equipment

Technical Field

The disclosure relates to the field of gesture data processing, and in particular to a gesture recognition method, a gesture recognition device and computer equipment.

Background

With the development of the times and the iteration of hardware devices, human-computer interaction has grown ever closer to natural human behavior: from mouse clicks, represented by Windows desktop applications, to taps and sliding gestures, represented by mobile applications, to the three-dimensional gestures and voice input of MR/AR smart glasses such as HoloLens. Gesture recognition is currently one of the most important and active research directions in interaction behavior. Conventional approaches in the field of gesture recognition are mainly sensor tracking schemes based on wearable devices and deep learning recognition schemes based on image recognition. However, sensor tracking schemes based on wearable devices can only recognize basic gestures: the recognizable gesture types are limited, and complex gestures cannot be recognized. Deep learning schemes based on image recognition, meanwhile, involve a huge amount of computation and high complexity.

Disclosure of Invention

In view of the foregoing, it is desirable to provide a gesture recognition method, apparatus and computer device that can recognize a variety of complex gestures without requiring a large amount of computation. In a first aspect, the present disclosure provides a gesture recognition method.
The method comprises the following steps: when gesture recognition is triggered, acquiring the spatial coordinates of each joint in the hand to be recognized; calculating, according to a pre-established projection coordinate system and the spatial coordinates, a first included angle and a projection distance between the fingertip joint and the palm joint of each finger in the hand to be recognized, and determining the bending degree of each finger according to the first included angle and the projection distance, wherein the projection coordinate system is established according to the palm joint and the wrist joint of the hand to be recognized; and determining a gesture recognition result of the hand to be recognized according to the bending degree of each finger.

In one embodiment, establishing the projection coordinate system comprises: determining a coordinate system origin according to the palm joint of the hand to be recognized; determining a first coordinate axis according to a first direction from the wrist joint to the palm joint; determining a second coordinate axis according to a straight line perpendicular to the first coordinate axis; and establishing the projection coordinate system according to the coordinate system origin, the first coordinate axis and the second coordinate axis.
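The coordinate-system establishment above can be sketched as follows. The patent only requires that the second axis be perpendicular to the first, so the particular perpendicular direction chosen here (a world-up helper vector orthogonalized against the first axis) is an assumption for illustration.

```python
import numpy as np

def build_projection_frame(wrist, palm):
    """Build the projection coordinate system: origin at the palm joint,
    first axis along the wrist-to-palm direction, second axis
    perpendicular to the first (perpendicular direction assumed)."""
    wrist = np.asarray(wrist, dtype=float)
    palm = np.asarray(palm, dtype=float)
    origin = palm                               # coordinate system origin
    x_axis = palm - wrist                       # first direction: wrist -> palm
    x_axis /= np.linalg.norm(x_axis)            # first coordinate axis (unit)
    # Pick a helper vector not parallel to the first axis, then
    # orthogonalize it to obtain a perpendicular second axis.
    helper = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(helper, x_axis)) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    y_axis = helper - np.dot(helper, x_axis) * x_axis
    y_axis /= np.linalg.norm(y_axis)            # second coordinate axis (unit)
    return origin, x_axis, y_axis
```

Because the frame is anchored at the palm and oriented by the wrist-to-palm direction, fingertip positions expressed in it are invariant to where the hand sits in the camera's world space, which is what lets simple per-finger measurements replace heavy model inference.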
In one embodiment, the projection distance comprises a first projection distance and a second projection distance, and calculating the first included angle and the projection distance between the fingertip joint and the palm joint of each finger in the hand to be recognized comprises: obtaining the first projection distance, the second projection distance and the first included angle according to the spatial coordinates of the fingertip joint and the palm joint of each finger; wherein the first projection distance is the distance between the fingertip joint of each finger and the first coordinate axis of the projection coordinate system; the second projection distance is the distance between the fingertip joint of each finger and the second coordinate axis; and the first included angle is the angle between a first vector and the second coordinate axis, the first vector being formed by the fingertip joint and the palm joint of each finger.

In one embodiment, after acquiring the spatial coordinates of each joint in the hand to be recognized, the method further comprises: determining, according to the spatial coordinates of the joints, the flexion-extension state of each finger and/or the open-close state between the fingers of the hand to be recognized; and determining the gesture recognition result of the hand to be recognized according to the flexion-extension states and/or the open-close states, or according to the flexion-extension states and/or the open-close states together with the bending degrees; the flexion-extension states comprise a stretched state, a curled state, an intermediate state between stretched and curled, and an unrecognizable state, and the open-close states comprise a closed state, a non-closed state and an unrecognizable state.
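The per-finger quantities of the first embodiment above can be sketched as follows. Measuring the two distances within the plane spanned by the two axes, and taking the fingertip-minus-palm direction for the first vector, are assumptions about details the patent does not pin down.

```python
import numpy as np

def bending_features(fingertip, palm, origin, x_axis, y_axis):
    """Compute the first/second projection distances of the fingertip
    from the frame's axes, and the first included angle between the
    first vector (fingertip relative to palm) and the second axis."""
    tip = np.asarray(fingertip, float) - np.asarray(origin, float)
    u = float(np.dot(tip, x_axis))     # fingertip coordinate along the first axis
    v = float(np.dot(tip, y_axis))     # fingertip coordinate along the second axis
    first_projection = abs(v)          # in-plane distance to the first coordinate axis
    second_projection = abs(u)         # in-plane distance to the second coordinate axis
    first_vector = np.asarray(fingertip, float) - np.asarray(palm, float)
    cos_angle = np.dot(first_vector, y_axis) / np.linalg.norm(first_vector)
    first_angle = float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))
    return first_projection, second_projection, first_angle
```

As a finger curls, its fingertip sweeps toward the palm, so the projection distances shrink and the first included angle changes monotonically, giving a cheap scalar proxy for bending degree with no learned model in the loop.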