CN-121982225-A - Posture skeleton-guided human-computer interaction comfort assessment method and system
Abstract
The invention relates to the technical field of computer vision, and in particular to a posture skeleton-guided human-computer interaction comfort assessment method and system. The method comprises: collecting an RGB image sequence and a depth image sequence of a working scene; generating time-series three-dimensional skeleton data from the RGB and depth sequences; calculating the joint-angle and angular-velocity time-series curves within a single operation cycle from the skeleton data, and counting the cumulative time during which a joint angle continuously deviates from the ergonomic neutral-position range; constructing a human biomechanical model and calculating the joint mechanical work; constructing a comfort evaluation function and outputting a local comfort score for the target body part; and, based on the local comfort scores, generating a dynamic comfort heat map superimposed on the three-dimensional human model and generating an optimization-suggestion report. The invention realizes non-contact, high-precision, objective and quantitative assessment of comfort, can accurately identify fatigue factors, and provides intuitive, actionable optimization guidance.
Inventors
- WANG JINGLUAN
- WANG XUPENG
- LI ZE
Assignees
- Xi'an University of Technology (西安理工大学)
Dates
- Publication Date
- 2026-05-05
- Application Date
- 2026-04-08
Claims (10)
- 1. A posture skeleton-guided human-computer interaction comfort assessment method, characterized by comprising the following steps: S100, synchronously acquiring an RGB image sequence and a depth image sequence of a working scene through a depth camera, and performing time alignment; S200, performing human-body instance segmentation and two-dimensional joint-point detection on the RGB image sequence, fusing the depth images of the corresponding frames, and mapping the two-dimensional joint-point coordinates into three-dimensional coordinates in the camera coordinate system using the camera calibration parameters, to generate time-series three-dimensional skeleton data; S300, for a preset target joint, calculating a joint-angle time-series curve θ(t) and an angular-velocity time-series curve ω(t) within a single operation cycle from the time-series three-dimensional skeleton data, and counting the cumulative time T_dev during which the joint angle continuously deviates from the ergonomic neutral-position range; S400, constructing a human biomechanical model; inputting θ(t), ω(t) and the limb-segment centroid accelerations computed from the time-series three-dimensional skeleton data into an inverse-dynamics sub-model to solve the net moment M_j(t) of the target joint; inputting M_j(t) into a muscle-load-distribution sub-model to solve the equivalent moment M_eq(t) of the main muscle groups driving the joint; and calculating the joint mechanical work W_j; S500, constructing a comfort evaluation function that takes the joint-angle extreme value θ_ext, the cumulative deviation time T_dev, the peak equivalent moment M_peak and the mechanical work W_j as input parameters and outputs a local comfort score C for the target body part; wherein θ_0 is the neutral-position angle, Δθ is the allowable angle range, T is the operation-cycle duration, M_ref and W_ref are the reference thresholds for the equivalent muscle-group moment and the joint mechanical workload respectively, w_1, w_2, w_3 and w_4 are weighting coefficients, and a larger score C indicates higher comfort; S600, generating, based on the local comfort scores, a dynamic comfort heat map superimposed on the three-dimensional human model, and generating an optimization-suggestion report; the single operation cycle is the complete time interval for executing one repetitive human-computer interaction task, and the target joint is pre-designated according to the type of the operation task.
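The rendered formula of the comfort evaluation function did not survive extraction, so only its named inputs, thresholds and weights are known. As an illustration only, here is a minimal Python sketch of one plausible weighted-penalty form built from those named quantities; the function name, the linear combination and the default weights are assumptions, not the patent's actual formula:

```python
def comfort_score(theta_ext, t_dev, m_peak, w_mech,
                  theta0, dtheta, t_cycle, m_ref, w_ref,
                  w1=0.3, w2=0.3, w3=0.2, w4=0.2):
    """Hypothetical local comfort score C in [0, 1]: each penalty term is one
    of the claim's named inputs normalized by its threshold; a larger C
    indicates higher comfort."""
    penalty = (w1 * abs(theta_ext - theta0) / dtheta   # posture deviation
               + w2 * t_dev / t_cycle                  # exposure time
               + w3 * m_peak / m_ref                   # peak muscle-group moment
               + w4 * w_mech / w_ref)                  # mechanical work
    return max(0.0, 1.0 - penalty)
```

Because each term is dimensionless (input normalized by its own threshold), the weights directly set the relative importance of posture deviation, exposure time, muscle moment and mechanical work.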
- 2. The posture skeleton-guided human-computer interaction comfort assessment method according to claim 1, wherein the specific steps of S200 include: S201, performing camera distortion correction on the RGB images and median-filter denoising on the depth images; S202, processing the distortion-corrected RGB images with an instance-segmentation neural network, outputting the mask and confidence of each human instance, and discarding instances whose confidence is below a first threshold; S203, for each retained human-instance mask region, outputting the two-dimensional coordinates and detection confidence of the preset joint points using a high-resolution pose-estimation network; S204, querying the depth value at the corresponding position in the aligned depth image according to the two-dimensional coordinates; if the depth value is invalid or the joint-point detection confidence is below a third threshold, taking the weighted median of the valid depth values in a neighborhood centered on the two-dimensional coordinates as the compensated depth value, the weights being inversely proportional to the Euclidean distance from each neighborhood pixel to the center point; S205, back-projecting the two-dimensional coordinates and the corresponding depth values into the camera coordinate system based on the camera intrinsic matrix to obtain the three-dimensional joint-point coordinates: X_i = (u_i - c_x) d_i / f_x, Y_i = (v_i - c_y) d_i / f_y, Z_i = d_i; wherein (u_i, v_i) are the image coordinates of the i-th joint point, d_i is the depth value at those image coordinates, K is the camera intrinsic matrix, f_x and f_y are the focal lengths, and (c_x, c_y) are the principal-point coordinates; S206, applying Kalman filtering to temporally smooth the three-dimensional coordinate sequence of each joint point across consecutive frames, and completing joint-point trajectories occluded for more than a preset number of frames using an interpolation algorithm based on kinematic constraints; S207, packaging the smoothed and completed three-dimensional joint-point coordinates frame by frame into structured data comprising a timestamp, an instance identifier and the joint-point coordinates, generating the time-series three-dimensional skeleton data.
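The back-projection in step S205 is the standard pinhole-camera model. A minimal sketch, using scalar intrinsics instead of the full matrix K for brevity:

```python
def backproject(u, v, d, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) with depth d into camera
    coordinates: X = (u - cx) * d / fx, Y = (v - cy) * d / fy, Z = d."""
    return ((u - cx) * d / fx, (v - cy) * d / fy, d)
```

For example, with fx = fy = 500 and principal point (320, 240), the principal-point pixel at depth 2 m maps to (0, 0, 2), and a pixel 500 columns to its right at depth 1 m maps to (1, 0, 1).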
- 3. The posture skeleton-guided human-computer interaction comfort assessment method according to claim 1, wherein calculating the centroid acceleration of a limb segment comprises determining the position-coordinate sequence of the limb-segment centroid in the three-dimensional skeleton according to anthropometric proportional parameters, and performing time-domain numerical differentiation on the position-coordinate sequence to obtain the centroid acceleration vector.
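The centroid-acceleration computation of claim 3 can be sketched per coordinate axis: locate the centroid at a fixed fraction of the segment, then apply a second-order central difference in time. The 0.43 ratio in the usage example is an illustrative anthropometric-table value, not taken from the patent:

```python
def segment_com(p_prox, p_dist, ratio):
    """Limb-segment centroid coordinates: located `ratio` of the way from the
    proximal joint to the distal joint (inputs are per-axis lists of floats)."""
    return [pp + ratio * (pd - pp) for pp, pd in zip(p_prox, p_dist)]

def second_diff(pos, dt):
    """Time-domain numerical differentiation (second-order central difference)
    of a uniformly sampled 1-D position series; returns len(pos) - 2 values."""
    return [(pos[i + 1] - 2 * pos[i] + pos[i - 1]) / dt ** 2
            for i in range(1, len(pos) - 1)]
```

Applying `second_diff` to a position series z(t) = t² sampled at dt = 0.1 recovers a constant acceleration of 2, as expected.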
- 4. The posture skeleton-guided human-computer interaction comfort assessment method according to claim 1, wherein the single operation cycle is determined by Hilbert-Huang transform analysis of the joint-angle time-series curve: when the standard deviation of the cycle length over three consecutive cycles is less than 3%, their average is taken as the operation-cycle length, and when the fluctuation exceeds 15%, manual review is triggered; the joint-angle time-series curve θ(t) is calculated differently according to the joint type, the joint types comprising hinge joints and ball joints; the angular-velocity time-series curve ω(t) is calculated by filtering θ(t) with a preset window length and then applying the central-difference method; and the cumulative deviation time T_dev is obtained by traversing θ(t) over a single cycle and summing the time during which the joint angle exceeds the preset neutral-position range.
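The angular-velocity and cumulative-deviation computations in claim 4 reduce to a few lines. A sketch assuming a uniformly sampled, already-filtered angle series; the two end samples fall back to one-sided differences:

```python
def angular_velocity(theta, dt):
    """Central-difference angular velocity of a uniformly sampled joint-angle
    series; one-sided differences at the two endpoints."""
    n = len(theta)
    out = []
    for i in range(n):
        j, k = max(i - 1, 0), min(i + 1, n - 1)   # neighbor indices, clipped
        out.append((theta[k] - theta[j]) / ((k - j) * dt))
    return out

def deviation_time(theta, lo, hi, dt):
    """Cumulative time the joint angle spends outside the neutral range [lo, hi]."""
    return dt * sum(1 for a in theta if a < lo or a > hi)
```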
- 5. The posture skeleton-guided human-computer interaction comfort assessment method according to claim 1, wherein the human biomechanical model is constructed based on the height and weight of the measured subject, the centroid position and moment of inertia of each limb segment are determined using the Winter anthropometric criteria, and a 14-rigid-segment multi-rigid-body dynamics model is constructed; the inverse-dynamics sub-model is solved with the Newton-Euler recursive algorithm, computing segment by segment from the distal end of the limb to the proximal end to obtain the net moment M_j of the target joint; M_j is then input to the muscle-load-distribution sub-model, which solves the optimization: minimize Σ_{k=1}^{N_j} (F_k / F_k^max)^2 subject to Σ_{k=1}^{N_j} r_k^T F_k = M_j; wherein F_k is the force of the k-th muscle, F_k^max is the maximum isometric force of the k-th muscle, r_k^T is the transpose of the moment-arm vector, and N_j is the total number of muscles involved in the movement of joint j.
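For a single-degree-of-freedom joint with scalar moment arms, the minimum-sum-of-squared-activations problem named in claim 5 has a closed form by Lagrange multipliers: each muscle force is proportional to its moment arm times the square of its maximum isometric force. A sketch, with made-up moment arms and maximum forces in the test values:

```python
def distribute_load(m_net, arms, f_max):
    """Closed-form minimizer of sum((F_k / F_max_k)**2) subject to
    sum(r_k * F_k) = m_net, for scalar moment arms r_k.
    Lagrange conditions give F_k = lambda * r_k * F_max_k**2."""
    denom = sum(r * r * f * f for r, f in zip(arms, f_max))
    return [m_net * r * f * f / denom for r, f in zip(arms, f_max)]
```

The solution always satisfies the moment-equilibrium constraint exactly, and stronger muscles with longer moment arms carry proportionally more of the load.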
- 6. The posture skeleton-guided human-computer interaction comfort assessment method according to claim 1, wherein the specific formula for calculating the joint mechanical work in step S500 is: W_j = ∫_{t_s}^{t_e} |M_j(t) · ω(t)| dt; wherein t_s and t_e are respectively the start and stop times of the operation cycle.
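Numerically, the work integral of claim 6 is a trapezoidal sum over the sampled moment and angular-velocity series. A sketch that integrates |M(t)·ω(t)| so that concentric and eccentric effort both count; the absolute value is an assumption, since the source formula did not survive extraction:

```python
def mechanical_work(moment, omega, dt):
    """Trapezoidal integral of |M(t) * w(t)| over one operation cycle,
    for uniformly sampled moment and angular-velocity series."""
    power = [abs(m, ) if False else abs(m * w) for m, w in zip(moment, omega)]
    return dt * (power[0] / 2 + sum(power[1:-1]) + power[-1] / 2)
```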
- 7. A posture skeleton-guided human-computer interaction comfort assessment system, characterized by comprising: a data acquisition module for synchronously acquiring an RGB image sequence and a depth image sequence of a working scene through a depth camera and performing time alignment; a three-dimensional skeleton generation module for performing human-body instance segmentation and two-dimensional joint-point detection on the RGB image sequence, fusing the depth images of the corresponding frames, and mapping the two-dimensional joint-point coordinates into three-dimensional coordinates in the camera coordinate system using the camera calibration parameters, to generate time-series three-dimensional skeleton data; a joint motion analysis module for calculating, for a preset target joint, a joint-angle time-series curve θ(t) and an angular-velocity time-series curve ω(t) within a single operation cycle from the time-series three-dimensional skeleton data, and counting the cumulative time T_dev during which the joint angle continuously deviates from the ergonomic neutral-position range; a biomechanical analysis module for constructing a human biomechanical model, inputting θ(t), ω(t) and the limb-segment centroid accelerations computed from the skeleton data into an inverse-dynamics sub-model to solve the net moment M_j(t) of the target joint, inputting M_j(t) into a muscle-load-distribution sub-model to solve the equivalent moment M_eq(t) of the main muscle groups driving the joint, and calculating the joint mechanical work W_j; a comfort evaluation module for constructing a comfort evaluation function that takes the joint-angle extreme value, the cumulative deviation time T_dev, the peak equivalent moment and the mechanical work W_j as input parameters and outputs a local comfort score C for the target body part; and a visualization and report generation module for generating, based on the local comfort scores, a dynamic comfort heat map superimposed on the three-dimensional human model and generating an optimization-suggestion report.
- 8. The posture skeleton-guided human-computer interaction comfort assessment system according to claim 7, wherein the three-dimensional skeleton generation module specifically comprises: a preprocessing unit for performing camera distortion correction on the RGB images and median-filter denoising on the depth images; an instance segmentation unit for processing the distortion-corrected RGB images with an instance-segmentation neural network, outputting the mask and confidence of each human instance, and discarding instances whose confidence is below a first threshold; a two-dimensional joint-point detection unit for outputting, for each retained human-instance mask region, the two-dimensional coordinates and detection confidence of the preset joint points using a high-resolution pose-estimation network; a depth compensation unit for querying the depth value at the corresponding position in the aligned depth image according to the two-dimensional coordinates and, if the depth value is invalid or the joint-point detection confidence is below a third threshold, taking the weighted median of the valid depth values in a neighborhood centered on the two-dimensional coordinates as the compensated depth value, the weights being inversely proportional to the Euclidean distance from each neighborhood pixel to the center point; a three-dimensional back-projection unit for back-projecting the two-dimensional coordinates and the corresponding depth values into the camera coordinate system based on the camera intrinsic matrix to obtain the three-dimensional joint-point coordinates; a time-series processing unit for applying Kalman filtering to temporally smooth the three-dimensional coordinate sequence of each joint point across consecutive frames, and completing joint-point trajectories occluded for more than a preset number of frames using an interpolation algorithm based on kinematic constraints; and a data packaging unit for packaging the smoothed and completed three-dimensional joint-point coordinates frame by frame into structured data comprising a timestamp, an instance identifier and the joint-point coordinates, generating the time-series three-dimensional skeleton data.
- 9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the posture skeleton-guided human-computer interaction comfort assessment method of any one of claims 1 to 6.
- 10. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the posture skeleton-guided human-computer interaction comfort assessment method of any one of claims 1 to 6.
Description
Posture skeleton-guided human-computer interaction comfort assessment method and system

Technical Field

The invention relates to the technical field of computer vision, and in particular to a posture skeleton-guided human-computer interaction comfort assessment method and system.

Background

The deep integration of human-computer interaction and ergonomics in fields such as industrial manufacturing, rehabilitation medicine and virtual reality places higher demands on operator comfort and occupational health. Traditional human-computer interaction comfort assessment relies on subjective questionnaires, expert judgment or simple playback of action video, making objective, quantitative and real-time evaluation difficult. With the development of computer vision and sensor technology, vision-based human posture estimation has gradually been applied to motion analysis and work-efficiency assessment, opening the possibility of non-contact, automated comfort assessment. However, existing methods concentrate on two-dimensional posture analysis or static posture assessment and lack deep modeling of three-dimensional motion time-series characteristics and internal biomechanical loads, so the assessment results deviate from real physiological sensation and can hardly reflect the accumulated fatigue and discomfort of long, repetitive operations.
In the prior art, some research attempts to acquire motion and muscle-activity data through wearable sensors; although this yields more direct biomechanical information, the equipment is cumbersome to wear, tends to interfere with normal operation, and is costly, making large-scale deployment in real production environments difficult. Another class of vision methods based on monocular RGB cameras achieves non-contact evaluation, but is limited by the lack of depth information, weak occlusion handling and insufficient three-dimensional reconstruction accuracy, with especially poor robustness in complex backgrounds and multi-person scenes. In addition, most evaluation models attend only to kinematic parameters such as joint angle and action frequency, do not effectively incorporate biomechanical mechanisms such as inverse dynamics and muscle-force distribution, and cannot accurately reflect key fatigue factors such as joint moment, muscle load and mechanical work, so the evaluation dimension is single and interpretability is limited.

Therefore, a comfort assessment method is needed that combines high-precision three-dimensional posture reconstruction, time-series motion analysis and multi-rigid-body biomechanics simulation, realizes objective, accurate and interpretable evaluation of local joint load and comfort in a non-contact, natural working state, and provides a quantitative basis for working-posture optimization and human-machine interface design.

Disclosure of Invention

Aiming at the defects in the prior art, the invention provides a posture skeleton-guided human-computer interaction comfort assessment method and system, which can effectively solve the above problems.
In order to achieve the above purpose, the invention is realized by the following technical scheme. The invention provides a posture skeleton-guided human-computer interaction comfort assessment method, comprising the following steps: S100, synchronously acquiring an RGB image sequence and a depth image sequence of a working scene through a depth camera, and performing time alignment; S200, performing human-body instance segmentation and two-dimensional joint-point detection on the RGB image sequence, fusing the depth images of the corresponding frames, and mapping the two-dimensional joint-point coordinates into three-dimensional coordinates in the camera coordinate system using the camera calibration parameters, to generate time-series three-dimensional skeleton data; S300, for a preset target joint, calculating a joint-angle time-series curve θ(t) and an angular-velocity time-series curve ω(t) within a single operation cycle from the time-series three-dimensional skeleton data, and counting the cumulative time T_dev during which the joint angle continuously deviates from the ergonomic neutral-position range; S400, constructing a human biomechanical model; inputting θ(t), ω(t) and the limb-segment centroid accelerations computed from the time-series three-dimensional skeleton data into an inverse-dynamics sub-model to solve the net moment M_j(t) of the target joint, and inputting M_j(t) into a muscle-load-distribution sub-model.