CN-121995905-A - Flight backpack composite control system based on exoskeleton gestures
Abstract
The invention relates to a flight backpack composite control system based on exoskeleton gestures, which comprises a gesture data training module, a gesture recognition module, a gesture instruction mapping module, a button instruction mapping module and a man-machine composite control module. The invention breaks through the single-mode control limitation of existing flight backpacks by fusing the structural advantages of the exoskeleton with its data acquisition capability and deeply coupling the two, thereby achieving high-precision, high-stability and highly real-time control of the flight backpack, solving traditional control pain points, exploiting the load-bearing and power-assisting functions of the exoskeleton and reducing the physical exertion of the operator. Meanwhile, a dedicated gesture recognition model is built on the exoskeleton's multidimensional sensors, improving anti-interference capability, making the system better suited to complex operating scenarios and widening its range of application.
Inventors
- Xie Xudong
- Wang Xinyue
- Liang Weixi
- Xia Yifan
- Yang Rirui
- Yu Long
- Jiang Wen
- Liu Dingyi
- Teng Xuefeng
- Wu Yaobin
- Song Xuan
- He Heng
- Huang Nenglang
- Dong Leqi
- Wu Caibin
- Shan Zhiyang
Assignees
- Nanchang Hangkong University (南昌航空大学)
Dates
- Publication Date
- 20260508
- Application Date
- 20260210
Claims (10)
- 1. A flight backpack composite control system based on exoskeleton gestures, characterized by comprising a gesture data training module, a gesture recognition module, a gesture instruction mapping module, a button instruction mapping module and a man-machine composite control module; the gesture data training module indirectly collects relative position information of the operator's fingers through six-axis spatial position sensors arranged on the exoskeleton, and preprocesses the collected gesture data to form basic data for gesture training; the gesture recognition module recognizes and analyzes the operator's gestures based on the model constructed by the gesture data training module and outputs a corresponding gesture type; the gesture instruction mapping module maps the gesture type output by the gesture recognition module to a preset flight backpack control instruction to generate a control signal corresponding to the gesture; the button instruction mapping module maps the buttons arranged on the exoskeleton to preset flight backpack control instructions to generate corresponding control instruction signals; the man-machine composite control module realizes mutual switching between the gesture control instructions and the physical-key control instructions arranged on the exoskeleton, and realizes composite control combining the two (an overall signal-flow sketch is given at the end of the description).
- 2. The flight backpack composite control system based on exoskeleton gestures of claim 1, wherein the gesture data training module comprises a data acquisition unit, a data training unit and a model construction unit; the data acquisition unit is used for acquiring relative position information of the operator's fingers and hands; the data training unit is used for combining a large number of gesture samples and simulating and training the gesture data; the model construction unit is used for establishing a gesture recognition model based on a U-Net deep learning network; the U-Net network adopts a symmetrical encoder-decoder framework comprising an encoder, a bottleneck layer and a decoder, the encoder extracts gesture features through downsampling, the decoder restores the feature dimensions through upsampling, and shallow encoder features are fused with deep decoder features through skip connections.
- 3. The flight backpack composite control system based on exoskeleton gestures of claim 2, wherein the convolution feature calculation formula of the U-Net deep learning network is: (1) $W_{out} = \lfloor (W_{in} - K + 2P)/S \rfloor + 1$, wherein $W_{in}$ is the input feature-map dimension, $K$ is the convolution kernel size, $P$ is the padding and $S$ is the stride; the encoder comprises four downsampling units, each consisting of two 3×3 convolutions, batch normalization and a ReLU activation function, the ReLU activation formula being: (2) $\mathrm{ReLU}(x) = \max(0, x)$; after each downsampling stage a 2×2 max pooling halves the feature-map size and doubles the number of channels; the decoder comprises four upsampling stages, each increasing the feature dimension through a 2×2 deconvolution whose feature calculation formula is consistent with the convolution formula; the corresponding encoder-layer features and the current decoder-layer features are concatenated in the channel dimension through skip connections, and the number of channels after concatenation satisfies: (3) $C_{cat} = C_{enc} + C_{dec}$, wherein $C_{enc}$ is the number of encoder output channels and $C_{dec}$ is the number of decoder upsampling channels; finally, a 1×1 convolution maps the channels to the gesture-category dimension and outputs predicted probability values (see the U-Net sketch after the claims).
- 4. The flight backpack composite control system based on exoskeleton gestures of claim 2, wherein, when the model construction unit builds the gesture recognition model, a combined loss function of a Dice loss and a binary cross-entropy loss is adopted: (4) $L = L_{BCE} + L_{Dice}$, with $L_{BCE} = -\frac{1}{N}\sum_{i=1}^{N}\left[y_i \ln p_i + (1-y_i)\ln(1-p_i)\right]$ and $L_{Dice} = 1 - \frac{2\sum_{i=1}^{N} y_i p_i + \varepsilon}{\sum_{i=1}^{N} y_i + \sum_{i=1}^{N} p_i + \varepsilon}$, wherein $y_i$ is the true label of the gesture category, $p_i$ is the model prediction probability, $N$ is the number of feature dimensions and $\varepsilon$ is a smoothing term preventing the denominator from being zero; the model output layer adopts a Sigmoid activation function: (5) $\sigma(x) = \frac{1}{1+e^{-x}}$, which normalizes the predicted value to the [0,1] interval and outputs a probability value for each gesture category (see the loss sketch after the claims).
- 5. The flight backpack composite control system based on exoskeleton gestures of claim 1, wherein the gesture recognition module comprises a real-time recognition unit and a recognition output unit; the real-time recognition unit is used for inputting real-time gesture data acquired by the exoskeleton six-axis spatial position sensors into the gesture recognition model to recognize the operator's current gesture in real time; the recognition output unit is used for calculating the gesture recognition confidence according to formula (6), wherein $N$ is the total number of gesture categories; when the confidence condition of formula (6) is satisfied, the category signal corresponding to the gesture is output (see the instruction-mapping sketch after the claims).
- 6. The flight backpack composite control system based on exoskeleton gestures of claim 1, wherein the gesture instruction mapping module comprises a gesture instruction matching unit and a gesture instruction output unit; the gesture instruction matching unit is used for matching preset flight backpack control instructions according to the recognized gesture type, and the gesture instruction output unit is used for converting the matched control instruction into a signal recognizable by the flight backpack central control system and outputting the backpack control instruction signal corresponding to the gesture.
- 7. The flight backpack composite control system based on exoskeleton gestures of claim 1, wherein the button instruction mapping module comprises a button instruction matching unit and a button instruction output unit; the button instruction matching unit is used for matching preset flight backpack control instructions according to trigger signals of the physical exoskeleton buttons, with the instructions corresponding to emergency control and function buttons having the highest priority; the button instruction output unit is used for converting the matched control instruction into a signal recognizable by the flight backpack central control system and outputting the backpack control instruction signal corresponding to the button.
- 8. The flight backpack composite control system based on exoskeleton gestures of claim 1, wherein the man-machine composite control module comprises a central control system, a key control unit, a gesture control unit and a mode conversion unit; the central control system is used for processing the incoming gesture control signals and key control signals and driving the flight backpack to perform the corresponding operation; the key control unit is used for acquiring the operator's key input and transmitting the signal to the central control system; the gesture control unit is used for converting gesture control instructions into signals and transmitting them to the central control system; the mode conversion unit is a physical button arranged on the exoskeleton and is used for coordinating and switching between gesture control instructions and key control instructions, thereby switching among three modes: gesture-only control, key-only control and composite combined control (see the mode-switching sketch after the claims).
- 9. The flight backpack composite control system based on exoskeleton gestures of claim 1, wherein the six-axis spatial position sensors are distributed over the joints of the exoskeleton fingers, each integrating a triaxial acceleration and triaxial angular velocity sensing unit, and the three-dimensional relative position of the hand is calculated through integral operations; the triaxial spatial displacement of a six-axis spatial position sensor is calculated as: (8) $s_x(t) = \iint a_x(t)\,\mathrm{d}t^2$, wherein $a_x$ is the x-axis acceleration and the initial velocity and initial displacement are both 0 (and likewise for the y and z axes); the joint spatial attitude angle is solved as: (9) $\theta_x(t) = \int \omega_x(t)\,\mathrm{d}t$, wherein $\omega_x$ is the angular velocity about the x axis and the initial attitude angle is 0; the spatial relative position vector of adjacent joint sensing nodes $i$ and $j$ is: (10) $\vec{r}_{ij} = \vec{p}_j - \vec{p}_i$; and the relative distance is: (11) $d_{ij} = \|\vec{r}_{ij}\| = \sqrt{(x_j-x_i)^2 + (y_j-y_i)^2 + (z_j-z_i)^2}$ (see the sensor-integration sketch after the claims).
- 10. The flight backpack composite control system based on exoskeleton gestures of claim 2, wherein the gesture data training preprocesses the data by Min-Max normalization: (12) $x_{norm} = \frac{x - x_{min}}{x_{max} - x_{min}}$, wherein $x_{norm}$ is the normalized data, $x$ is the raw sensor position data, and $x_{min}$ and $x_{max}$ are the data minimum and maximum values respectively; the preprocessed data set is divided into a training set, a validation set and a test set in the proportion 7:1:2, the training set being used for model parameter learning, the validation set for hyperparameter adjustment and the test set for model performance evaluation; model training then proceeds to hyperparameter configuration, in which the optimal parameters and the initial learning rate $\eta_0$ are determined by Bayesian optimization, followed by optimizer training with a training-round threshold of 1000 and a batch size of 16, the parameters being updated by the Adam optimizer whose core formulas are: (13) $m_t = \beta_1 m_{t-1} + (1-\beta_1) g_t$, $v_t = \beta_2 v_{t-1} + (1-\beta_2) g_t^2$, $\theta_t = \theta_{t-1} - \eta\,\hat{m}_t/(\sqrt{\hat{v}_t} + \epsilon)$, wherein $\beta_1$, $\beta_2$ are exponential decay coefficients and $g_t$ is the loss gradient; training then proceeds by inputting samples in batches, computing the loss by forward propagation and updating the parameters by back-propagation, evaluating performance on the validation set each epoch, recording the loss and synchronously monitoring the validation loss and current learning rate; model optimization follows: first, loss monitoring counts the number of consecutive epochs in which the validation loss has not decreased; if this count is below the preset threshold, the learning rate is kept unchanged and training continues; if it reaches the threshold, the learning rate is adjusted to half of its original value and training continues; if the training round reaches 1000, training stops and the optimal weights are saved; finally, the trained network is checked on the test set, and if the indices are not met the process returns to the preprocessing step for adjustment until performance reaches the standard, after which the final model is output for real-time recognition; the core index verification formulas are: (14) $Precision = \frac{TP}{TP+FP}$, $Recall = \frac{TP}{TP+FN}$, $F1 = \frac{2 \cdot Precision \cdot Recall}{Precision + Recall}$, wherein $TP$ is true positives, $FP$ is false positives and $FN$ is false negatives on the model validation set (see the training sketch after the claims).
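The following sketches are illustrative only and are not part of the claims. Claims 2 and 3 describe an encoder-bottleneck-decoder network with two 3×3 convolution + batch normalization + ReLU blocks per stage, 2×2 pooling and deconvolution, skip-connection channel concatenation (formula (3)) and a final 1×1 convolution. A minimal sketch of such a network in PyTorch follows, under the assumption that the preprocessed sensor data are arranged as a single-channel 2-D feature map; the base width and the class count `n_classes` are placeholder values, not taken from the patent.

```python
# Minimal U-Net-style gesture model (illustrative sketch, not the patented network).
import torch
import torch.nn as nn

def double_conv(c_in, c_out):
    # two 3x3 convolutions, each followed by batch normalization and ReLU
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),
        nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, kernel_size=3, padding=1),
        nn.BatchNorm2d(c_out), nn.ReLU(inplace=True),
    )

class GestureUNet(nn.Module):
    def __init__(self, in_ch=1, n_classes=8, base=16):
        super().__init__()
        chs = [base, base * 2, base * 4, base * 8]            # four encoder stages
        self.encoders = nn.ModuleList()
        c = in_ch
        for ch in chs:
            self.encoders.append(double_conv(c, ch))
            c = ch
        self.pool = nn.MaxPool2d(2)                            # 2x2 max pooling: size halved
        self.bottleneck = double_conv(chs[-1], chs[-1] * 2)    # channels doubled
        self.upconvs, self.decoders = nn.ModuleList(), nn.ModuleList()
        c = chs[-1] * 2
        for ch in reversed(chs):                               # four decoder stages
            self.upconvs.append(nn.ConvTranspose2d(c, ch, kernel_size=2, stride=2))
            self.decoders.append(double_conv(ch * 2, ch))      # ch*2 channels after skip concat
            c = ch
        self.head = nn.Conv2d(chs[0], n_classes, kernel_size=1)  # 1x1 conv to class dimension

    def forward(self, x):
        skips = []
        for enc in self.encoders:
            x = enc(x)
            skips.append(x)                                    # shallow features for skip connections
            x = self.pool(x)
        x = self.bottleneck(x)
        for up, dec, skip in zip(self.upconvs, self.decoders, reversed(skips)):
            x = up(x)                                          # 2x2 deconvolution
            x = torch.cat([skip, x], dim=1)                    # C_cat = C_enc + C_dec (formula (3))
            x = dec(x)
        return torch.sigmoid(self.head(x))                     # per-class probabilities in [0, 1]

# usage: with a 1x32x32 input, the four pooling stages reduce the map to 2x2 at the bottleneck
probs = GestureUNet()(torch.randn(1, 1, 32, 32))
```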
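Claim 4 combines a Dice loss with a binary cross-entropy loss and normalizes the output with a sigmoid (formula (5)). The sketch below assumes the two terms are simply summed, which the source formula does not state explicitly; the smoothing constant `eps` is an assumed small value.

```python
# Combined Dice + binary cross-entropy loss (illustrative NumPy sketch).
import numpy as np

def sigmoid(x):
    # formula (5): squashes raw scores to the [0, 1] interval
    return 1.0 / (1.0 + np.exp(-x))

def dice_bce_loss(y_true, y_pred, eps=1e-6):
    """y_true: 0/1 gesture labels; y_pred: predicted probabilities in [0, 1]."""
    y_true = y_true.astype(float).ravel()
    y_pred = np.clip(y_pred.ravel(), eps, 1.0 - eps)
    # binary cross-entropy term
    bce = -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
    # Dice term; eps keeps the denominator non-zero
    dice = 1.0 - (2.0 * np.sum(y_true * y_pred) + eps) / (np.sum(y_true) + np.sum(y_pred) + eps)
    return bce + dice

# tiny usage example
scores = np.array([2.0, -1.0, 0.5])            # raw model outputs
print(dice_bce_loss(np.array([1, 0, 1]), sigmoid(scores)))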
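Claims 5 to 7 describe thresholded gesture output and the two instruction-mapping modules, with emergency and function buttons taking priority. The Python sketch below illustrates one way such a mapping could be wired; all gesture names, command strings, priorities and the 0.8 confidence threshold are invented placeholders, not values from the patent.

```python
# Illustrative gesture/button -> flight backpack command mapping (hypothetical names).
from dataclasses import dataclass

GESTURE_COMMANDS = {"fist": "HOVER", "open_palm": "ASCEND", "point": "FORWARD"}
BUTTON_COMMANDS = {"btn_emergency": "EMERGENCY_LAND", "btn_mode": "SWITCH_MODE"}
HIGH_PRIORITY_BUTTONS = {"btn_emergency", "btn_mode"}     # emergency/function buttons win

@dataclass
class Command:
    name: str
    priority: int      # higher value handled first by the central control system

def gesture_to_command(class_probs: dict, threshold: float = 0.8):
    """Recognition output + gesture matching: emit a command only above the confidence threshold."""
    gesture, confidence = max(class_probs.items(), key=lambda kv: kv[1])
    if confidence < threshold or gesture not in GESTURE_COMMANDS:
        return None
    return Command(GESTURE_COMMANDS[gesture], priority=1)

def button_to_command(button_id: str):
    """Button instruction matching: emergency/function buttons get the highest priority."""
    if button_id not in BUTTON_COMMANDS:
        return None
    prio = 2 if button_id in HIGH_PRIORITY_BUTTONS else 1
    return Command(BUTTON_COMMANDS[button_id], priority=prio)

print(gesture_to_command({"fist": 0.92, "open_palm": 0.05, "point": 0.03}))
print(button_to_command("btn_emergency"))
```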
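Claim 8 introduces a mode conversion unit, a physical button that switches among gesture-only, key-only and composite control. A minimal sketch of that switching logic follows; the mode names and the cycling/filtering policy are assumptions for illustration.

```python
# Mode conversion unit sketch: one button cycles through the three control modes.
from enum import Enum

class ControlMode(Enum):
    GESTURE_ONLY = 1
    KEY_ONLY = 2
    COMPOSITE = 3

class ModeConverter:
    def __init__(self):
        self.mode = ControlMode.COMPOSITE

    def on_mode_button(self):
        # each press of the exoskeleton mode button advances to the next mode
        order = list(ControlMode)
        self.mode = order[(order.index(self.mode) + 1) % len(order)]

    def accept(self, source: str) -> bool:
        """Decide whether a command from 'gesture' or 'key' reaches the central control system."""
        if self.mode is ControlMode.COMPOSITE:
            return True
        if self.mode is ControlMode.GESTURE_ONLY:
            return source == "gesture"
        return source == "key"

mc = ModeConverter()
mc.on_mode_button()                              # leave composite mode
print(mc.mode, mc.accept("gesture"), mc.accept("key"))
```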
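Claim 9 derives hand position from the six-axis sensors by double-integrating acceleration (formula (8)), integrating angular rate (formula (9)) and differencing adjacent joint positions (formulas (10) and (11)). The sketch below uses simple rectangular integration with zero initial conditions, as the claim states; a real IMU pipeline would additionally need drift compensation, which the patent does not detail.

```python
# Six-axis sensor integration sketch (formulas (8)-(11)).
import numpy as np

def integrate_displacement(accel: np.ndarray, dt: float) -> np.ndarray:
    """accel: (T, 3) triaxial acceleration samples -> (T, 3) displacement, v(0) = s(0) = 0."""
    velocity = np.cumsum(accel * dt, axis=0)     # v(t) = integral of a
    return np.cumsum(velocity * dt, axis=0)      # s(t) = integral of v

def integrate_attitude(gyro: np.ndarray, dt: float) -> np.ndarray:
    """gyro: (T, 3) triaxial angular rate -> (T, 3) attitude angles, theta(0) = 0."""
    return np.cumsum(gyro * dt, axis=0)

def relative_position(p_i: np.ndarray, p_j: np.ndarray):
    r_ij = p_j - p_i                             # formula (10): relative position vector
    return r_ij, float(np.linalg.norm(r_ij))     # formula (11): Euclidean distance

# toy example: 100 samples at 100 Hz of constant 0.1 m/s^2 acceleration along x
accel = np.tile([0.1, 0.0, 0.0], (100, 1))
disp = integrate_displacement(accel, dt=0.01)
print(disp[-1], relative_position(np.zeros(3), disp[-1])[1])
```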
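Claim 10 covers Min-Max normalization (formula (12)), the 7:1:2 data split, plateau-based halving of the learning rate, and the precision/recall/F1 check of formula (14). The sketch below isolates those pieces; the patience value, random seed and omission of the actual model/optimizer step are simplifying assumptions.

```python
# Data pipeline and training-schedule sketch for claim 10 (illustrative only).
import numpy as np

def min_max_normalize(x: np.ndarray) -> np.ndarray:
    # formula (12), applied per feature column
    return (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0) + 1e-12)

def split_7_1_2(data: np.ndarray, seed: int = 0):
    idx = np.random.default_rng(seed).permutation(len(data))
    n_train, n_val = int(0.7 * len(data)), int(0.1 * len(data))
    return data[idx[:n_train]], data[idx[n_train:n_train + n_val]], data[idx[n_train + n_val:]]

class PlateauHalver:
    """Halve the learning rate once validation loss has not improved for `patience` epochs."""
    def __init__(self, lr: float, patience: int = 10):
        self.lr, self.patience, self.best, self.stale = lr, patience, float("inf"), 0

    def step(self, val_loss: float) -> float:
        if val_loss < self.best:
            self.best, self.stale = val_loss, 0
        else:
            self.stale += 1
            if self.stale >= self.patience:
                self.lr, self.stale = self.lr / 2.0, 0
        return self.lr

def precision_recall_f1(tp: int, fp: int, fn: int):
    # formula (14)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

print(precision_recall_f1(tp=90, fp=5, fn=10))
```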
Description
Flight backpack composite control system based on exoskeleton gestures

Technical Field
The invention relates to the technical field of flight backpack control, and in particular to a flight backpack composite control system based on exoskeleton gestures.

Background
With the continuous development of personal flight equipment and human-machine interaction technology, demand for flight backpacks in fields such as emergency rescue and special operations is steadily increasing. Higher requirements are being placed on the safety, stability and operating efficiency of flight backpack control, driving the application and development of novel human-machine interaction technology in this field. Existing flight backpack control mostly relies on traditional means such as control sticks, buttons or gesture sensing; the operating mode is single, the control degrees of freedom are limited, and in complex flight environments control is easily affected by the operator's state and external environmental factors, resulting in insufficient control precision and poor stability. Meanwhile, some gesture recognition control depends on vision or a single sensor, has limited anti-interference capability and lacks an effective redundant control and safeguard mechanism, making it difficult to meet practical flight backpack control requirements. In addition, existing applications of the exoskeleton to flight backpack control mostly stop at the exoskeleton's load-bearing and assisting role, without combining the gesture data the exoskeleton can acquire to establish a gesture recognition model. There is therefore a need for a flight backpack composite control system exclusively adapted to the exoskeleton, which solves these control problems through the exoskeleton's hardware advantages and algorithm optimization.

Disclosure of Invention
Aiming at the defects of the prior art, the invention aims to provide a flight backpack composite control system based on exoskeleton gestures, so as to improve recognition precision, stability and real-time performance in flight backpack control. The invention is realized by the following technical scheme. The invention provides a flight backpack composite control system based on exoskeleton gestures, comprising a gesture data training module, a gesture recognition module, a gesture instruction mapping module, a button instruction mapping module and a man-machine composite control module. The gesture data training module indirectly collects relative position information of the operator's fingers through six-axis spatial position sensors arranged on the exoskeleton and preprocesses the collected gesture data to form basic data for gesture training. The gesture data training module comprises: a data acquisition unit, used for acquiring relative position information of the operator's fingers and hands; a data training unit, used for combining a large number of gesture samples and simulating and training the gesture data; and a model construction unit, used for building a gesture recognition model based on the U-Net deep learning network. The gesture recognition module recognizes and analyzes the operator's gestures based on the U-Net gesture recognition model and outputs the corresponding gesture type.
The gesture recognition module comprises: a real-time recognition unit, used for inputting real-time gesture data acquired by the exoskeleton into the gesture recognition model to recognize the operator's current gesture in real time; and a recognition output unit, used for outputting the category signal corresponding to the gesture. The gesture instruction mapping module is used for mapping the gesture type output by the gesture recognition module to a preset flight backpack control instruction, thereby generating the control signal corresponding to the gesture. The gesture instruction mapping module comprises: a gesture instruction matching unit, used for matching the corresponding flight backpack control instruction according to the recognized gesture type; and a gesture instruction output unit, used for outputting the backpack control instruction signal corresponding to the gesture. The button instruction mapping module is used for mapping the physical buttons arranged on the exoskeleton to preset flight backpack control instructions, thereby generating the corresponding control instruction signals. The button instruction mapping module comprises: a button instruction matching unit, used for matching the corresponding flight backpack control instruction to each button; and a button instruction output unit, used for outputting the backpack control instruction signal corresponding to the button. The man-machine composite control module is used for realizing mutual switching between gesture control instructions and the physical-key control instructions arranged on the exoskeleton, and for realizing composite control combining the two, as sketched below.
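The overall signal flow of claim 1 and the description above can be summarized as: sensor data feed the gesture recognition module, whose output passes through the gesture instruction mapping module, while button presses pass through the button instruction mapping module; both paths are gated by the mode conversion unit before reaching the central control system. The following glue-code sketch only illustrates that flow; every name and the lambda stand-ins are hypothetical, not identifiers from the patent.

```python
# One cycle of composite control: gesture path and button path feed the central controller.
def run_control_cycle(sensor_frame, pressed_buttons, recognize, map_gesture,
                      map_button, mode_filter, central_control):
    commands = []
    gesture = recognize(sensor_frame)                          # gesture recognition module
    if gesture is not None:
        commands.append(("gesture", map_gesture(gesture)))     # gesture instruction mapping
    for btn in pressed_buttons:
        commands.append(("key", map_button(btn)))              # button instruction mapping
    # man-machine composite control: the mode conversion unit decides which sources pass
    for source, cmd in commands:
        if cmd is not None and mode_filter(source):
            central_control(cmd)                               # drive the flight backpack

# usage example with trivial stand-ins for the individual modules
run_control_cycle(
    sensor_frame=[0.1, 0.2],
    pressed_buttons=["btn_emergency"],
    recognize=lambda frame: "fist",
    map_gesture=lambda g: "HOVER",
    map_button=lambda b: "EMERGENCY_LAND",
    mode_filter=lambda source: True,        # composite mode: accept both sources
    central_control=print,
)
```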